Routine Data Quality Assessment (RDQA)

                                           Checklist to Assess Program/Project Data Quality


                                           Number of Regional Aggregation Sites                                      1

                                           Number of District Aggregation Sites                                      1

                                           Number of Service Delivery Sites                                          1




                                                                   Version: Jan 2010



Important notes for the use of this spreadsheet:


1. In order to use the Routine Data Quality Assessment tool you will need to ensure that your macro security is set to a level lower than 'High'. With the spreadsheet open, go
to the 'Tools' pull-down menu and select 'Macro', then 'Security'. Select 'Medium'. Close Excel and re-open the file. The next time you open the file you will have to select
'Enable Macros' for the application to work as designed.

2. On the START Page (this page), please select number of intermediate aggregation sites (IAS) and Service Delivery Points (SDPs) that you plan to review from the dropdown
lists above. IAS are typically the district level health unit of the Ministry of Health.




                                                                                     START                                                                                    Page 1
B – INSTRUCTIONS FOR USE OF THE RDQA
1. Determine Purpose

The RDQA checklist can be used for:

- Initial assessment of M&E systems established by new implementing partners (or in decentralized systems) to collect, manage and report data.

- Routine supervision of data management and reporting systems and data quality at various levels. For example, routine supervision visits may include checking a certain time period's
worth of data (e.g. one day, one week or one month) at the service site level, whereas periodic assessments (e.g. quarterly, biannually or annually) could be carried out at all levels to assess
the functioning of the entire Program/project's M&E system.

- Periodic assessment by donors of the quality of data being provided to them (this use of the RDQA could be more frequent and more streamlined than official data quality audits that use the
DQA for Auditing, but less frequent than routine monitoring of data).

- Preparation for a formal data quality audit.

The RDQA is flexible for all of these uses. Countries and programs are encouraged to adapt the checklist to fit local program contexts.




2. Level/Site Selection
Select levels and sites to be included (depending on the purpose and resources available). Once the purpose has been determined, the second step in the RDQA is to decide what levels of
the data-collection and reporting system will be included in the assessment - service sites, intermediate aggregation levels, and/or central M&E unit. The levels should be determined once
the appropriate reporting levels have been identified and “mapped” (e.g., there are 100 sites providing the services in 10 districts. Reports from sites are sent to districts, which then send
aggregated reports to the M&E Unit). In some cases, the data flow will include more than one intermediate level (e.g. regions, provinces or states or multiple levels of program organizations).




3. Identify indicators, data sources and reporting period.
The RDQA is designed to assess the quality of data and underlying systems related to indicators that are reported to programs or donors. It is necessary to select one or more indicators – or at
least program areas – to serve as the subject of the RDQA. This choice will be based on the list of reported indicators. For example, a program focusing on treatment for HIV may report
indicators of numbers of people on ART. Another program may focus on meeting the needs of orphans or vulnerable children, therefore the indicators for that program would be from the OVC
program area. A malaria program might focus on providing insecticide-treated bed nets (ITN) or on treating people for malaria – or on both of those activities.




4. Conduct site visits. During the site visits, the relevant sections of the appropriate checklists in the Excel file are filled out (e.g. the service site checklist at service sites, etc). These
checklists are completed following interviews of relevant staff and reviews of site documentation. Using the drop down lists on the HEADER page of this workbook, select the appropriate
number of Intermediate Aggregation Levels (IAL) and Service Delivery Points (SDP) to be reviewed. The appropriate number of worksheets will automatically appear in the RDQA workbook
(up to 12 SDP and 4 IALs).

5. Review outputs and findings. The RDQA outputs need to be reviewed for each site visited. Site-specific summary findings, in the form of recommendations, are noted at each site visited.




The RDQA checklists exist in MS Excel format and responses can be entered directly into the spreadsheets on the computer. Alternatively, the checklists can be printed and completed by
hand. When completed electronically, a dashboard produces graphics of summary statistics for each site and level of the reporting system.
The dashboard displays two (2) graphs for each site visited:


- A spider-graph displays qualitative data generated from the assessment of the data-collection and reporting system and can be used to prioritize areas for improvement.
- A bar-chart shows the quantitative data generated from the data verifications; these can be used to plan for data quality improvement.


 In addition, a 'Global Dashboard' shows statistics aggregated across and within levels to highlight overall strengths and weaknesses in the reporting system. The Global Dashboard shows a
spider graph for qualitative assessments and a bar chart for quantitative assessments as above. In addition, strengths and weaknesses of the reporting system are displayed as dimensions of
data quality in a 100% stacked bar chart. For this analysis, questions are grouped by the applicable dimension of data quality (e.g. accuracy or reliability) and the number of responses by type
of response (e.g. 'Yes - completely', 'Partly', etc.) is plotted as a percentage of all responses. A table of survey questions and their associated dimensions of data quality can be found on the
'Dimensions of data quality' tab in this workbook.
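The percentage calculation behind the 100% stacked bar chart can be sketched as follows. The dimension names and responses below are illustrative, not taken from the workbook; the real question-to-dimension mapping lives on the 'Dimensions of data quality' tab:

```python
from collections import Counter

# Hypothetical responses grouped by data-quality dimension (illustrative only).
responses = {
    "Accuracy":    ["Yes - completely", "Partly", "Yes - completely", "No - not at all"],
    "Reliability": ["Partly", "Partly", "Yes - completely"],
}

def response_percentages(answers):
    """Share of each answer type as a percentage of all responses for one dimension."""
    counts = Counter(answers)
    total = sum(counts.values())
    return {answer: 100 * n / total for answer, n in counts.items()}

for dimension, answers in responses.items():
    print(dimension, response_percentages(answers))
```

Each dimension's percentages sum to 100, which is what lets the chart stack the answer types into a full-height bar per dimension.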



6. Develop a system’s strengthening plan, including follow-up actions. The final output of the RDQA is an action plan for improving data quality which describes the identified
strengthening measures, the staff responsible, the timeline for completion, resources required and follow-up. Using the graphics and the detailed comments for each question, weak
performing functional areas of the reporting system can be identified. Program staff can then outline strengthening measures (e.g. training, data reviews), assign responsibilities and timelines
and identify resources using the Action Plan tab in this workbook.




                                                                                         INSTRUCTIONS                                                                                           Page 2
C – BACKGROUND INFORMATION – RDQA


Country:



Name of Program/project:



Indicator Reviewed:



Reporting Period Verified:



Assessment Team:                        Name                                                 Title                          Email

                Primary contact:




                                             M&E Management Unit at Central Level

                      Name of Site   Facility Code                                                                              Date (mm/dd/yy)

1-

                                                Regional Level Aggregation Sites

                      Name of Site   Facility Code                                                   Region   Region Code       Date (mm/dd/yy)

1

                                                 District Level Aggregation Sites

                      Name of Site   Facility Code          District         District Code           Region   Region Code       Date (mm/dd/yy)

1

                                                 Service Delivery Points (SDPs)

                      Name of Site   Facility Code          District         District Code           Region   Region Code       Date (mm/dd/yy)

1




                                                          Information_Page                                                                    Page 3
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -
                  Component of the M&E System

                                                                                      Answer Codes: Yes - completely / Partly / No - not at all / N/A

                                                                                      REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                              -


     What are the reasons for the discrepancy (if any) observed (i.e., data entry
7
     errors, arithmetic errors, missing source documents, other)?
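The verification factor computed in step 6 is simply the ratio of recounted to site-reported results. A minimal sketch, with illustrative numbers:

```python
def verification_factor(recounted, reported):
    """Ratio of recounted [A] to site-reported [B] results, as in step 6.

    A factor above 1.0 suggests under-reporting (more events found in source
    documents than were reported); below 1.0 suggests over-reporting.
    """
    if reported == 0:
        return None  # a ratio against a zero report is undefined
    return recounted / reported

# Illustrative numbers, not taken from any real site report:
print(verification_factor(95, 100))  # 0.95 -> the site reported 5% more than was found
```

Discrepancies in either direction should then be explained in item 7 (data entry errors, arithmetic errors, missing source documents, etc.).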

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 1                                                                                Page 4
Part 2. Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … If yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.).


22       … If yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 1   Page 5
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                          Responsible(s)            Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Spider graph: "Data Management Assessment - Service Delivery Point". Axes: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System. Scale: 0.00 to 3.00.]

                  [Bar chart: "Data and Reporting Verifications - Service Delivery Point". X-axis: Verification Factor. Y-axis: 0% to 1200%.]




                                                                                      Service Point 1                                                                                Page 6
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                  Component of the M&E System

                                                                                      Answer Codes: Yes - completely / Partly / No - not at all / N/A

                                                                                      REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                             -


     What are the reasons for the discrepancy (if any) observed (i.e., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 2                                                                                Page 7
Part 2. Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … If yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.).


22       … If yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 2   Page 8
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                          Responsible(s)            Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Spider graph: "Data Management Assessment - Service Delivery Point". Axes: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System. Scale: 0.00 to 3.00.]

                  [Bar chart: "Data and Reporting Verifications - Service Delivery Point". X-axis: Verification Factor. Y-axis: 0% to 1200%.]




                                                                                       Service Point 2                                                                                   Page 9
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                  Component of the M&E System

                                                                                      Answer Codes: Yes - completely / Partly / No - not at all / N/A

                                                                                      REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                             -


     What are the reasons for the discrepancy (if any) observed (i.e., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).
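The two-direction check described above amounts to a set comparison. A minimal sketch; the patient IDs are purely illustrative, and a real check would sample the site's actual Patient Treatment Cards and registers:

```python
# Hypothetical sampled record IDs (not real data).
cards = {"P01", "P02", "P03", "P04"}      # sampled Patient Treatment Cards
register = {"P02", "P03", "P04", "P05"}   # matching entries in the register

# Direction 1: cards -> register (patients with a card but no register entry)
not_in_register = cards - register
# Direction 2: register -> cards (register entries with no matching card)
not_on_cards = register - cards

print(sorted(not_in_register))  # ['P01']
print(sorted(not_on_cards))     # ['P05']
```

Any ID surfacing in either direction is a discrepancy to explain in items 8–10.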


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 3                                                                               Page 10
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … the standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of computerized systems).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc.).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc.).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered
21
       (e.g., region, district, ward).


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 3   Page 11
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                          Responsible(s)            Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  Data Management Assessment - Service Delivery Point                                                          Data and Reporting Verifications -
                                                                                                                                     Service Delivery Point

[Radar chart: Systems Assessment scores, scale 0.00-3.00, across the five components: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]

[Bar chart: Verification Factor, scale 0%-1200%.]

                                                                                       Service Point 3                                                                               Page 12
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                                                                                      Answer Codes:
                                                                                      Yes - completely                               REVIEWER COMMENTS
                  Component of the M&E System                                              Partly        (Please provide detail for each response not coded "Yes - Completely". Detailed
                                                                                     No - not at all                   responses will help guide strengthening measures. )
                                                                                             N/A




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                             -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 4                                                                               Page 13
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … the standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of computerized systems).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc.).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc.).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered
21
       (e.g., region, district, ward).


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 4   Page 14
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                           Responsible(s)           Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  Data Management Assessment - Service Delivery Point                                                           Data and Reporting Verifications -
                                                                                                                                      Service Delivery Point

[Radar chart: Systems Assessment scores, scale 0.00-3.00, across the five components: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]

[Bar chart: Verification Factor, scale 0%-1200%.]




                                                                                       Service Point 4                                                                               Page 15
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                                                                                      Answer Codes:
                                                                                      Yes - completely                               REVIEWER COMMENTS
                  Component of the M&E System                                              Partly        (Please provide detail for each response not coded "Yes - Completely". Detailed
                                                                                     No - not at all                   responses will help guide strengthening measures. )
                                                                                             N/A




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                             -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 5                                                                               Page 16
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … the standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of computerized systems).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc.).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc.).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered
21
       (e.g., region, district, ward).


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 5   Page 17
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                          Responsible(s)            Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  Data Management Assessment - Service Delivery Point                                                          Data and Reporting Verifications -
                                                                                                                                     Service Delivery Point

[Radar chart: Systems Assessment scores, scale 0.00-3.00, across the five components: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]

[Bar chart: Verification Factor, scale 0%-1200%.]




                                                                                       Service Point 5                                                                               Page 18
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                                                                                      Answer Codes:
                                                                                      Yes - completely                               REVIEWER COMMENTS
                  Component of the M&E System                                              Partly        (Please provide detail for each response not coded "Yes - Completely". Detailed
                                                                                     No - not at all                   responses will help guide strengthening measures. )
                                                                                             N/A




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                             -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 6                                                                               Page 19
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … the standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of a computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.)


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 6   Page 20
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any identified data quality challenges and recommended strengthening
       measures, with an estimate of how long each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                          Responsible(s)            Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Chart: Data Management Assessment - Service Delivery Point. Radar chart (scale 0.00 to 3.00) across the five system areas: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]

                  [Chart: Data and Reporting Verifications - Service Delivery Point. Bar chart of the Verification Factor (axis 0% to 1200%).]




                                                                                       Service Point 6                                                                               Page 21
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                                                                                      Answer Codes: Yes - completely / Partly / No - not at all / N/A

                  Component of the M&E System                                          REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                             -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?
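The ratio in item 6 is the verification factor. A minimal sketch of the calculation (illustrative code, not part of the spreadsheet) is:

```python
def verification_factor(recounted, reported):
    """Verification factor [A/B]: recounted results (A) divided by
    site-reported results (B). A ratio above 1.0 suggests the site
    under-reported; below 1.0, it over-reported."""
    if reported == 0:
        return None  # ratio undefined when nothing was reported
    return recounted / reported
```

For example, recounting 95 events against 100 reported gives a factor of 0.95, i.e., 5% over-reporting.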

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 7                                                                               Page 22
Part 2. Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … the standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of a computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.)


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 7   Page 23
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any identified data quality challenges and recommended strengthening
       measures, with an estimate of how long each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                           Responsible(s)           Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Chart: Data Management Assessment - Service Delivery Point. Radar chart (scale 0.00 to 3.00) across the five system areas: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]

                  [Chart: Data and Reporting Verifications - Service Delivery Point. Bar chart of the Verification Factor (axis 0% to 1200%).]




                                                                                       Service Point 7                                                                               Page 24
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                                                                                      Answer Codes: Yes - completely / Partly / No - not at all / N/A

                  Component of the M&E System                                          REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                             -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 8                                                                               Page 25
Part 2. Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … the standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of a computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.)


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 8   Page 26
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any identified data quality challenges and recommended strengthening
       measures, with an estimate of how long each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                           Responsible(s)           Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Chart: Data Management Assessment - Service Delivery Point. Radar chart (scale 0.00 to 3.00) across the five system areas: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]

                  [Chart: Data and Reporting Verifications - Service Delivery Point. Bar chart of the Verification Factor (axis 0% to 1200%).]




                                                                                       Service Point 8                                                                               Page 27
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                                                                                      Answer Codes: Yes - completely / Partly / No - not at all / N/A

                  Component of the M&E System                                          REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                             -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 9                                                                               Page 28
Part 2. Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … the standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of a computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.).


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 9   Page 29
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any identified challenges to data quality and the recommended strengthening
       measures, with an estimate of how long each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                           Responsible(s)           Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Radar chart: Data Management Assessment - Service Delivery Point. Axes, scored 0.00-3.00: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]

                  [Bar chart: Data and Reporting Verifications - Service Delivery Point. Y-axis 0%-1200%; series: Verification Factor.]

                                                                                       Service Point 9                                                                               Page 30
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                  Component of the M&E System
                  Answer Codes: Yes - completely / Partly / No - not at all / N/A
                  REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                              -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?
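The verification factor in item 6 is the simple ratio of the recounted result [A] to the site-reported result [B], expressed as a percentage. A minimal illustrative sketch (the function name and the example counts are hypothetical, not part of the RDQA tool):

```python
def verification_factor(recounted: int, reported: int) -> float:
    """Return the recounted-to-reported ratio [A/B] as a percentage.

    Values below 100% suggest the site over-reported (fewer results could be
    verified than were reported); values above 100% suggest under-reporting.
    """
    if reported == 0:
        raise ValueError("Site-reported count [B] must be non-zero")
    return recounted * 100 / reported

# Example with made-up counts: 95 recounted vs. 100 reported
print(verification_factor(95, 100))  # 95.0
```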

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).
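The two-direction cross-check described above amounts to comparing record identifiers between the two documents in both directions; a minimal sketch with made-up IDs (the ID format and sample values are illustrative only):

```python
# Hypothetical IDs transcribed from sampled Patient Treatment Cards
cards = {"PT-001", "PT-002", "PT-003", "PT-004"}
# Hypothetical IDs found in the facility Register for the same period
register = {"PT-001", "PT-003", "PT-004", "PT-099"}

# Direction 1 (cards -> register): sampled cards never entered in the Register
missing_from_register = sorted(cards - register)
# Direction 2 (register -> cards): Register entries with no supporting card
missing_card = sorted(register - cards)

print(missing_from_register)  # ['PT-002']
print(missing_card)           # ['PT-099']
```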


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 10                                                                              Page 31
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II - Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV - Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc.).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc.).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.).


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 10   Page 32
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any identified challenges to data quality and the recommended strengthening
       measures, with an estimate of how long each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                         Responsible(s)             Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Radar chart: Data Management Assessment - Service Delivery Point. Axes, scored 0.00-3.00: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]

                  [Bar chart: Data and Reporting Verifications - Service Delivery Point. Y-axis 0%-1200%; series: Verification Factor.]

                                                                                      Service Point 10                                                                               Page 33
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                  Component of the M&E System
                  Answer Codes: Yes - completely / Partly / No - not at all / N/A
                  REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                              -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 11                                                                              Page 34
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II - Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV - Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc.).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc.).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.).


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 11   Page 35
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any identified challenges to data quality and the recommended strengthening
       measures, with an estimate of how long each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                       Responsible(s)               Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Radar chart: Data Management Assessment - Service Delivery Point. Axes, scored 0.00-3.00: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]

                  [Bar chart: Data and Reporting Verifications - Service Delivery Point. Y-axis 0%-1200%; series: Verification Factor.]

                                                                                      Service Point 11                                                                               Page 36
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                  Component of the M&E System
                  Answer Codes: Yes - completely / Partly / No - not at all / N/A
                  REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                              -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 12                                                                              Page 37
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II - Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV - Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc.).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc.).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.).


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 12   Page 38
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any identified challenges to data quality and the recommended strengthening
       measures, with an estimate of how long each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                          Responsible(s)            Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Radar chart: Data Management Assessment - Service Delivery Point. Axes, scored 0.00-3.00: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]

                  [Bar chart: Data and Reporting Verifications - Service Delivery Point. Y-axis 0%-1200%; series: Verification Factor.]

                                                                                      Service Point 12                                                                               Page 39
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                  Component of the M&E System
                  Answer Codes: Yes - completely / Partly / No - not at all / N/A
                  REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                              -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?
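The arithmetic behind items 4 to 6 is a single ratio, later displayed on the dashboard as the Verification Factor. A minimal sketch in Python (illustrative only; the workbook computes this automatically from the values entered in [A] and [B]):

```python
def verification_factor(recounted, reported):
    """Ratio of recounted results [A] to site-reported results [B].

    1.0   -> the recount matches the report exactly
    < 1.0 -> over-reporting (the site reported more than the source documents support)
    > 1.0 -> under-reporting (the source documents support more than was reported)
    """
    if reported == 0:
        return None  # ratio undefined; flag for reviewer follow-up
    return recounted / reported

# Hypothetical figures for one indicator and reporting period:
factor = verification_factor(recounted=95, reported=100)
print(f"Verification factor: {factor:.0%}")  # prints "Verification factor: 95%"
```

Any factor other than 100% should be explained under item 7 (data entry errors, arithmetic errors, missing source documents, etc.).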

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?
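The two-direction cross-check described above amounts to sampling records on each side and testing membership on the other. A minimal sketch, using hypothetical patient-card and register IDs (in practice this is done by hand against the paper documents):

```python
import random

def cross_check(card_ids, register_ids, sample_size=20):
    """Cross-check patient treatment cards against a register in both directions.

    Returns (cards missing from the register, register entries with no card),
    each drawn from a random sample of up to `sample_size` records per side.
    """
    cards, register = set(card_ids), set(register_ids)
    card_sample = random.sample(sorted(cards), min(sample_size, len(cards)))
    register_sample = random.sample(sorted(register), min(sample_size, len(register)))
    missing_from_register = [c for c in card_sample if c not in register]
    missing_card = [r for r in register_sample if r not in cards]
    return missing_from_register, missing_card

# Hypothetical IDs: card "P-104" was never entered in the register.
unregistered, uncarded = cross_check(["P-101", "P-102", "P-104"],
                                     ["P-101", "P-102", "P-103"])
```

Every ID that turns up in either list is a discrepancy to record and explain under items 8 to 10.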




                                                                                     Service Point 13                                                                              Page 40
Part 2. Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … the standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of computerized systems).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (e.g., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc.).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc.).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.).


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 13   Page 41
Part 3: Recommendations for the Service Site

       Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                         Responsible(s)             Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Radar chart: Data Management Assessment - Service Delivery Point. Average score (scale 0.00 to 3.00) for each of the five
                  systems-assessment components: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting
                  Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National
                  Reporting System.]

                  [Bar chart: Data and Reporting Verifications - Service Delivery Point. Verification Factor, axis 0% to 1200%.]
                                                                                      Service Point 13                                                                               Page 42
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                  Component of the M&E System
                  Answer Codes: Yes - completely / Partly / No - not at all / N/A
                  REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                              -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 14                                                                              Page 43
Part 2. Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … the standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of computerized systems).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (e.g., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc.).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc.).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.).


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 14   Page 44
Part 3: Recommendations for the Service Site

       Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                         Responsible(s)             Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Radar chart: Data Management Assessment - Service Delivery Point. Average score (scale 0.00 to 3.00) for each of the five
                  systems-assessment components: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting
                  Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National
                  Reporting System.]

                  [Bar chart: Data and Reporting Verifications - Service Delivery Point. Verification Factor, axis 0% to 1200%.]
                                                                                      Service Point 14                                                                               Page 45
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                  Component of the M&E System
                  Answer Codes: Yes - completely / Partly / No - not at all / N/A
                  REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                              -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 15                                                                              Page 46
Part 2. Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … the standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of computerized systems).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (e.g., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc.).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc.).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.).


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 15   Page 47
Part 3: Recommendations for the Service Site

       Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                         Responsible(s)             Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Radar chart: Data Management Assessment - Service Delivery Point. Average score (scale 0.00 to 3.00) for each of the five
                  systems-assessment components: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting
                  Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National
                  Reporting System.]

                  [Bar chart: Data and Reporting Verifications - Service Delivery Point. Verification Factor, axis 0% to 1200%.]
                                                                                      Service Point 15                                                                               Page 48
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                  Component of the M&E System
                  Answer Codes: Yes - completely / Partly / No - not at all / N/A
                  REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                              -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?
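The ratio in item 6 is the only calculation on this sheet: the verification factor is the recounted result [A] divided by the reported result [B], shown as a percentage on the dashboard. A minimal sketch of that arithmetic (illustrative only; the function name is invented and not part of the RDQA workbook):

```python
from typing import Optional

def verification_factor(recounted: int, reported: int) -> Optional[float]:
    """Ratio of recounted to reported results [A/B].

    1.0 (100%) means perfect agreement; below 100% suggests
    over-reporting by the site, above 100% suggests under-reporting.
    Returns None when nothing was reported, since the ratio is undefined.
    """
    if reported == 0:
        return None
    return recounted / reported

# Example: 95 clients recounted from the register, 100 reported by the site.
factor = verification_factor(95, 100)
print(f"Verification factor: {factor:.0%}")  # prints: Verification factor: 95%
```

The dashboard's wide percentage axis simply accommodates large over- or under-reporting ratios.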

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).
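The bidirectional cross-check described above reduces to a two-way set comparison between the sampled source documents. A hedged sketch (all patient IDs here are invented for illustration):

```python
# Hypothetical IDs from sampled Patient Treatment Cards and the register.
treatment_cards = {"P001", "P002", "P003", "P004"}
register_entries = {"P001", "P003", "P004", "P005"}

# Direction 1: patients with a card who never appear in the register.
cards_not_in_register = treatment_cards - register_entries
# Direction 2: register entries with no corresponding card.
register_without_card = register_entries - treatment_cards

print("On a card but not in the register:", sorted(cards_not_in_register))
print("In the register but no card:", sorted(register_without_card))
```

Either non-empty difference is a discrepancy to explain in items 8-10.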


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 16                                                                              Page 49
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II - Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … the standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of a computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV - Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (i.e.
21
       region, district, ward, etc.)


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 16   Page 50
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                       Responsible(s)               Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



[Dashboard charts: "Data Management Assessment - Service Delivery Point" is a radar plot of the five Systems Assessment components (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System) scored on a 0.00 to 3.00 scale; "Data and Reporting Verifications - Service Delivery Point" is a bar chart of the Verification Factor on a 0% to 1200% axis.]
                                                                                      Service Point 16                                                                               Page 51
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                  Component of the M&E System
                  Answer Codes: Yes - completely / Partly / No - not at all / N/A
                  REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                              -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 17                                                                              Page 52
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II - Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … the standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of a computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV - Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (i.e.
21
       region, district, ward, etc.)


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 17   Page 53
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                       Responsible(s)               Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



[Dashboard charts: "Data Management Assessment - Service Delivery Point" is a radar plot of the five Systems Assessment components (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System) scored on a 0.00 to 3.00 scale; "Data and Reporting Verifications - Service Delivery Point" is a bar chart of the Verification Factor on a 0% to 1200% axis.]
                                                                                      Service Point 17                                                                               Page 54
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                  Component of the M&E System
                  Answer Codes: Yes - completely / Partly / No - not at all / N/A
                  REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                              -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 18                                                                              Page 55
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II - Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … the standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of a computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV - Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (i.e.
21
       region, district, ward, etc.)


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 18   Page 56
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                         Responsible(s)             Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



[Dashboard charts: "Data Management Assessment - Service Delivery Point" is a radar plot of the five Systems Assessment components (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System) scored on a 0.00 to 3.00 scale; "Data and Reporting Verifications - Service Delivery Point" is a bar chart of the Verification Factor on a 0% to 1200% axis.]
                                                                                      Service Point 18                                                                               Page 57
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                  Component of the M&E System
                  Answer Codes: Yes - completely / Partly / No - not at all / N/A
                  REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                              -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?
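The ratio in item 6 can be sketched as follows. This is an illustrative example only, not part of the RDQA spreadsheet; the function name and figures are invented.

```python
def verification_factor(recounted, reported):
    """Ratio of recounted results [A] to site-reported results [B].

    1.0 means perfect agreement; above 1.0 suggests under-reporting;
    below 1.0 suggests over-reporting. Returns None if nothing was reported.
    """
    if reported == 0:
        return None
    return recounted / reported

# Example: 95 clients recounted from the registers vs. 100 reported by the site.
ratio = verification_factor(95, 100)
print(f"{ratio:.0%}")  # prints "95%" -- a possible over-report of 5 clients
```

The discrepancy identified this way is what item 7 asks the reviewer to explain.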

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).
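The two-direction cross-check described above amounts to checking set membership both ways between the sampled patient cards and the register. A minimal sketch, with invented patient IDs:

```python
# Hypothetical sample: IDs from 4 randomly selected Patient Treatment Cards
# and the corresponding entries found in the facility register.
treatment_cards = {"PT-014", "PT-022", "PT-031", "PT-047"}
register_entries = {"PT-014", "PT-022", "PT-047", "PT-058"}

# Direction 1: cards -> register (service delivered but never registered?)
cards_missing_from_register = treatment_cards - register_entries
# Direction 2: register -> cards (registered but no supporting card?)
register_missing_cards = register_entries - treatment_cards

print("On cards but not in register:", sorted(cards_missing_from_register))
print("In register but without a card:", sorted(register_missing_cards))
```

Any ID surfaced in either direction is a discrepancy to document under items 8-10.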


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 19                                                                              Page 58
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc.).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … If yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc.).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.).


22       … If yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 19   Page 59
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any identified challenges to data quality and recommended strengthening
       measures, with an estimate of how long each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                        Responsible(s)              Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Radar chart: Data Management Assessment - Service Delivery Point. Axes: I - M&E Structure, Functions and Capabilities;
                  II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools;
                  IV - Data Management Processes; V - Links with National Reporting System. Scale: 0.00 to 3.00.]

                  [Bar chart: Data and Reporting Verifications - Service Delivery Point. Verification Factor, scale 0% to 1200%.]




                                                                                      Service Point 19                                                                               Page 60
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                                                                                      Answer Codes: Yes - completely / Partly / No - not at all / N/A

                  Component of the M&E System                                         REVIEWER COMMENTS (Please provide detail for each response not coded
                                                                                      "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                              -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 20                                                                              Page 61
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc.).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … If yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc.).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.).


22       … If yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 20   Page 62
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any identified challenges to data quality and recommended strengthening
       measures, with an estimate of how long each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                         Responsible(s)             Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Radar chart: Data Management Assessment - Service Delivery Point. Axes: I - M&E Structure, Functions and Capabilities;
                  II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools;
                  IV - Data Management Processes; V - Links with National Reporting System. Scale: 0.00 to 3.00.]

                  [Bar chart: Data and Reporting Verifications - Service Delivery Point. Verification Factor, scale 0% to 1200%.]




                                                                                      Service Point 20                                                                               Page 63
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                                                                                      Answer Codes: Yes - completely / Partly / No - not at all / N/A

                  Component of the M&E System                                         REVIEWER COMMENTS (Please provide detail for each response not coded
                                                                                      "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                              -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 21                                                                              Page 64
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc.).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … If yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc.).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.).


22       … If yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 21   Page 65
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any identified challenges to data quality and recommended strengthening
       measures, with an estimate of how long each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                         Responsible(s)             Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Radar chart: Data Management Assessment - Service Delivery Point. Axes: I - M&E Structure, Functions and Capabilities;
                  II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools;
                  IV - Data Management Processes; V - Links with National Reporting System. Scale: 0.00 to 3.00.]

                  [Bar chart: Data and Reporting Verifications - Service Delivery Point. Verification Factor, scale 0% to 1200%.]




                                                                                      Service Point 21                                                                               Page 66
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                                                                                      Answer Codes: Yes - completely / Partly / No - not at all / N/A

                  Component of the M&E System                                         REVIEWER COMMENTS (Please provide detail for each response not coded
                                                                                      "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                              -


     What are the reasons for the discrepancy (if any) observed (e.g., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 22                                                                              Page 67
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

1      There are designated staff responsible for reviewing aggregated numbers
       prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

2      The responsibility for recording the delivery of services on source documents
       is clearly assigned to the relevant staff.

3      All relevant staff have received training on the data management processes
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

8      Clear instructions have been provided by the M&E Unit on how to complete
       the data collection and reporting forms/tools.

9      The M&E Unit has identified standard reporting forms/tools to be used by all
       reporting levels.

10     … The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

14     If applicable, there is a written back-up procedure for when data entry or data
       processing is computerized.

15       … if yes, the latest date of back-up is appropriate given the frequency of
         update of the computerized system (e.g., back-ups are weekly or monthly).

16     Relevant personal data are maintained according to national or international
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

18     The reporting system enables the identification and recording of a "drop out",
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

19     When available, the relevant national forms/tools are used for data collection
       and reporting.

20     When applicable, data are reported through a single channel of the national
       information systems.

21     The system records information about where the service is delivered (e.g.,
       region, district, ward, etc.)


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 22   Page 68
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                         Responsible(s)             Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Radar chart: Data Management Assessment - Service Delivery Point. Average score (0.00-3.00) per component:
                  I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines;
                  III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes;
                  V - Links with National Reporting System.]

                  [Bar chart: Data and Reporting Verifications - Service Delivery Point. Verification Factor, scale 0%-1200%.]




                                                                                      Service Point 22                                                                               Page 69
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                                                                                      Answer Codes:
                                                                                      Yes - completely                               REVIEWER COMMENTS
                  Component of the M&E System                                              Partly        (Please provide detail for each response not coded "Yes - Completely". Detailed
                                                                                     No - not at all                   responses will help guide strengthening measures. )
                                                                                             N/A




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site-reported numbers and explain discrepancies (if any).

4    Recount the number of people, cases or events during the reporting period by
     reviewing the source documents. [A]

5    Enter the number of people, cases or events reported by the site during the
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                              -


7    What are the reasons for the discrepancy (if any) observed (e.g., data entry
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 23                                                                              Page 70
Part 2. Systems Assessment

     I - M&E Structure, Functions and Capabilities

1      There are designated staff responsible for reviewing aggregated numbers
       prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

2      The responsibility for recording the delivery of services on source documents
       is clearly assigned to the relevant staff.

3      All relevant staff have received training on the data management processes
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

8      Clear instructions have been provided by the M&E Unit on how to complete
       the data collection and reporting forms/tools.

9      The M&E Unit has identified standard reporting forms/tools to be used by all
       reporting levels.

10     … The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

14     If applicable, there is a written back-up procedure for when data entry or data
       processing is computerized.

15       … if yes, the latest date of back-up is appropriate given the frequency of
         update of the computerized system (e.g., back-ups are weekly or monthly).

16     Relevant personal data are maintained according to national or international
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

18     The reporting system enables the identification and recording of a "drop out",
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

19     When available, the relevant national forms/tools are used for data collection
       and reporting.

20     When applicable, data are reported through a single channel of the national
       information systems.

21     The system records information about where the service is delivered (e.g.,
       region, district, ward, etc.)


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 23   Page 71
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                         Responsible(s)             Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Radar chart: Data Management Assessment - Service Delivery Point. Average score (0.00-3.00) per component:
                  I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines;
                  III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes;
                  V - Links with National Reporting System.]

                  [Bar chart: Data and Reporting Verifications - Service Delivery Point. Verification Factor, scale 0%-1200%.]




                                                                                      Service Point 23                                                                               Page 72
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                                                                                      Answer Codes:
                                                                                      Yes - completely                               REVIEWER COMMENTS
                  Component of the M&E System                                              Partly        (Please provide detail for each response not coded "Yes - Completely". Detailed
                                                                                     No - not at all                   responses will help guide strengthening measures. )
                                                                                             N/A




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

     Recount results from source documents, compare the verified numbers to the
     site-reported numbers and explain discrepancies (if any).

4    Recount the number of people, cases or events during the reporting period by
     reviewing the source documents. [A]

5    Enter the number of people, cases or events reported by the site during the
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                              -


7    What are the reasons for the discrepancy (if any) observed (e.g., data entry
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 24                                                                              Page 73
Part 2. Systems Assessment

     I - M&E Structure, Functions and Capabilities

1      There are designated staff responsible for reviewing aggregated numbers
       prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

2      The responsibility for recording the delivery of services on source documents
       is clearly assigned to the relevant staff.

3      All relevant staff have received training on the data management processes
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

8      Clear instructions have been provided by the M&E Unit on how to complete
       the data collection and reporting forms/tools.

9      The M&E Unit has identified standard reporting forms/tools to be used by all
       reporting levels.

10     … The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

14     If applicable, there is a written back-up procedure for when data entry or data
       processing is computerized.

15       … if yes, the latest date of back-up is appropriate given the frequency of
         update of the computerized system (e.g., back-ups are weekly or monthly).

16     Relevant personal data are maintained according to national or international
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

18     The reporting system enables the identification and recording of a "drop out",
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

19     When available, the relevant national forms/tools are used for data collection
       and reporting.

20     When applicable, data are reported through a single channel of the national
       information systems.

21     The system records information about where the service is delivered (e.g.,
       region, district, ward, etc.)


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 24   Page 74
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                         Responsible(s)             Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  [Radar chart: Data Management Assessment - Service Delivery Point. Average score (0.00-3.00) per component:
                  I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines;
                  III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes;
                  V - Links with National Reporting System.]

                  [Bar chart: Data and Reporting Verifications - Service Delivery Point. Verification Factor, scale 0%-1200%.]




                                                                                      Service Point 24                                                                               Page 75
Service Delivery Site Summary Statistics

                  [Radar chart: Data Management Assessment - Service Site Summary. Average score (0-3) per component:
                  M&E Structure, Functions and Capabilities; Indicator Definitions and Reporting Guidelines;
                  Data-collection and Reporting Forms / Tools; Data Management Processes;
                  Links with National Reporting System.]

                  [Histogram: Data and Reporting Verifications - Service Site Summary. X-axis: Percent Accuracy
                  (<=70, 71-80, 81-90, 91-100, 101-110, 111-120, 121-130, >130); y-axis: Number of Sites.]
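
The summary histogram groups sites by percent accuracy (the verification factor expressed as a percentage). A minimal sketch of that binning, assuming the bin edges labeled on the chart (the function name is illustrative):

```python
def accuracy_bin(percent_accuracy):
    """Map a site's percent accuracy (verification factor x 100) to the
    bins used in the summary histogram."""
    edges = [(70, "<=70"), (80, "71-80"), (90, "81-90"), (100, "91-100"),
             (110, "101-110"), (120, "111-120"), (130, "121-130")]
    for upper, label in edges:
        if percent_accuracy <= upper:
            return label
    return ">130"

print(accuracy_bin(95.0))  # prints 91-100
```

Bins above 100 capture over-reporting sites; bins below capture under-reporting ones, so the shape of the histogram shows both the direction and spread of reporting error across sites.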




                                                                                 Service Site Summary                                                                                        Page 76
Data Verification and System Assessment Sheet - District Site
                                                   District Site/Organization:                                                             -

                                                           Region and District:                                                            -

                                                            Indicator Reviewed:                                                            -

                                                                  Date of Review:                                                          -

                                                   Reporting Period Verified:                                                              -

                                                                                         Answer Codes:
                                                                                         Yes - completely                            REVIEWER COMMENTS
                    Component of the M&E System                                                Partly        (Please provide detail for each response not coded "Yes - Completely". Detailed
                                                                                         No - not at all                  responses will help guide strengthening measures. )
                                                                                               N/A




Part 1: Data Verifications

A - Recounting Reported Results:

Recount results from the periodic reports sent from service sites to the District and
compare to the value reported by the District. Explain discrepancies (if any).

1      Re-aggregate the numbers from the reports received from all Service Delivery
       Points. What is the re-aggregated number? [A]

2      What aggregated result was contained in the summary report prepared by the
       District (and submitted to the next reporting level)? [B]


3      Calculate the ratio of recounted to reported numbers. [A/B]                                -


4      What are the reasons for the discrepancy (if any) observed (e.g., data entry
       errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance:

Review availability, completeness, and timeliness of reports from all Service Delivery
Sites. How many reports should there have been from all Sites? How many are
there? Were they received on time? Are they complete?


5      How many reports should there have been from all service sites? [A]


6      How many reports are there? [B]


7      Calculate % Available Reports [B/A]                                                        -


8      Check the dates on the reports received. How many reports were received
       on time (i.e., received by the due date)? [C]


9      Calculate % On time Reports [C/A]                                                          -


10     How many reports were complete? (i.e., complete means that the report
       contained all the required indicator data*). [D]


11     Calculate % Complete Reports [D/A]                                                         -
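
The arithmetic behind rows 3, 7, 9 and 11 can be sketched in a short script. The names and sample inputs below are illustrative only (the tool itself computes these via spreadsheet formulas); a zero denominator corresponds to the "-" shown on the sheet.

```python
def ratio_pct(numerator, denominator):
    """Return numerator/denominator as a percentage, or None when the
    denominator is zero (the sheet displays "-" in that case)."""
    if denominator == 0:
        return None
    return numerator / denominator * 100

# Illustrative inputs (hypothetical, not from the tool):
recounted = 95   # [A] re-aggregated from Service Delivery Point reports
reported = 100   # [B] aggregate in the District summary report
expected = 12    # [A] reports expected from all service sites
received = 11    # [B] reports actually on file
on_time = 10     # [C] reports received by the due date
complete = 9     # [D] reports containing all required indicator data

# Row 3: recounted vs. reported (>100% suggests under-reporting,
# <100% suggests over-reporting by the District).
verification_factor = ratio_pct(recounted, reported)

# Rows 7, 9, 11: reporting performance.
pct_available = ratio_pct(received, expected)
pct_on_time = ratio_pct(on_time, expected)
pct_complete = ratio_pct(complete, expected)
```

These four values are what the Part 4 dashboard bar chart plots for the site.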




Part 2: Systems Assessment

    I - M&E Structure, Functions and Capabilities
       There are designated staff responsible for reviewing the quality of data (i.e.,
1      accuracy, completeness and timeliness) received from sub-reporting levels
       (e.g., service points).

       There are designated staff responsible for reviewing aggregated numbers
2
       prior to submission to the next level (e.g., to the central M&E Unit).

       All relevant staff have received training on the data management processes
3
       and tools.




                                                                                District Site 1                                                                 Page 77
II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4        … what they are supposed to report on.


5        … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7        … when the reports are due.


    III- Data-collection and Reporting Forms / Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.

       … The standard forms/tools are consistently used by the Service Delivery
10
       Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

    IV- Data Management Processes

       Feedback is systematically provided to all service points on the quality of their
12
       reporting (i.e., accuracy, completeness and timeliness).

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

       If yes, the latest date of back-up is appropriate given the frequency of update
15
       of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

       There is a written procedure to address late, incomplete, inaccurate and
19     missing reports; including following-up with service points on data quality
       issues.
       If data discrepancies have been uncovered in reports from service points, the
20     Intermediate Aggregation Levels (e.g., districts or regions) have documented
       how these inconsistencies have been resolved.

    V - Links with National Reporting System

       When applicable, the data are reported through a single channel of the
21
       national reporting system.

       When available, the relevant national forms/tools are used for data-collection
22
       and reporting.

       The system records information about where the service is delivered (e.g.,
23
       region, district, ward, etc.).


24     … if yes, place names are recorded using standardized naming conventions.




                                                                                 District Site 1   Page 78
Part 3: Recommendations for the District Site

Based on the findings of the systems review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measures, together with an estimate of the length of time each improvement measure could take (see the systems assessment, function area by function area, in the table below for a review of the system). Action points should be discussed with the Program.


       Identified Weaknesses                                                               Description of Action Point                      Responsible(s)

1



2



3



4




Part 4: DASHBOARD: District Site



[Radar chart: "Data Management Assessment - District Site". One axis per system area (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System), scored 0-3.]

[Bar chart: "Data and Reporting Verifications - District Site". Categories: Verification factor, % Available, % On Time, % Complete; vertical axis 0-1200%.]




                                                                                District Site 1                                                               Page 79
Data Verification and System Assessment Sheet - District Site
                                                   District Site/Organization:                                                             -

                                                           Region and District:                                                            -

                                                            Indicator Reviewed:                                                            -

                                                                  Date of Review:                                                          -

                                                   Reporting Period Verified:                                                              -

                    Component of the M&E System

                    Answer Codes: Yes - completely / Partly / No - not at all / N/A

                    REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Recounting reported Results:

Recount results from the periodic reports sent from service sites to the District and
compare to the value reported by the District. Explain discrepancies (if any).

       Re-aggregate the numbers from the reports received from all Service Delivery
1
       Points. What is the re-aggregated number? [A]

       What aggregated result was contained in the summary report prepared by the
2
       District (and submitted to the next reporting level)? [B]


3      Calculate the ratio of recounted to reported numbers. [A/B]                                -


       What are the reasons for the discrepancy (if any) observed (i.e., data entry
4
       errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance:

Review availability, completeness, and timeliness of reports from all Service Delivery
Sites. How many reports should there have been from all Sites? How many are
there? Were they received on time? Are they complete?


5      How many reports should there have been from all service sites? [A]


6      How many reports are there? [B]


7      Calculate % Available Reports [B/A]                                                        -


       Check the dates on the reports received. How many reports were received
8
       on time? (i.e., received by the due date). [C]


9      Calculate % On time Reports [C/A]                                                          -


       How many reports were complete? (i.e., complete means that the report
10
       contained all the required indicator data*). [D]


11     Calculate % Complete Reports [D/A]                                                         -




Part 2: Systems Assessment

    I - M&E Structure, Functions and Capabilities
       There are designated staff responsible for reviewing the quality of data (i.e.,
1      accuracy, completeness and timeliness) received from sub-reporting levels
       (e.g., service points).

       There are designated staff responsible for reviewing aggregated numbers
2
       prior to submission to the next level (e.g., to the central M&E Unit).

       All relevant staff have received training on the data management processes
3
       and tools.




                                                                                District Site 2                                                                 Page 80
II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4        … what they are supposed to report on.


5        … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7        … when the reports are due.


    III- Data-collection and Reporting Forms / Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.

       … The standard forms/tools are consistently used by the Service Delivery
10
       Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

    IV- Data Management Processes

       Feedback is systematically provided to all service points on the quality of their
12
       reporting (i.e., accuracy, completeness and timeliness).

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

       If yes, the latest date of back-up is appropriate given the frequency of update
15
       of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

       There is a written procedure to address late, incomplete, inaccurate and
19     missing reports; including following-up with service points on data quality
       issues.
       If data discrepancies have been uncovered in reports from service points, the
20     Intermediate Aggregation Levels (e.g., districts or regions) have documented
       how these inconsistencies have been resolved.

    V - Links with National Reporting System

       When applicable, the data are reported through a single channel of the
21
       national reporting system.

       When available, the relevant national forms/tools are used for data-collection
22
       and reporting.

       The system records information about where the service is delivered (e.g.,
23
       region, district, ward, etc.).


24     … if yes, place names are recorded using standardized naming conventions.




                                                                                 District Site 2   Page 81
Part 3: Recommendations for the District Site

Based on the findings of the systems review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measures, together with an estimate of the length of time each improvement measure could take (see the systems assessment, function area by function area, in the table below for a review of the system). Action points should be discussed with the Program.


       Identified Weaknesses                                                               Description of Action Point                      Responsible(s)

1



2



3



4




Part 4: DASHBOARD: District Site



[Radar chart: "Data Management Assessment - District Site". One axis per system area (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System), scored 0-3.]

[Bar chart: "Data and Reporting Verifications - District Site". Categories: Verification factor, % Available, % On Time, % Complete; vertical axis 0-1200%.]




                                                                                District Site 2                                                               Page 82
Data Verification and System Assessment Sheet - District Site
                                                   District Site/Organization:                                                             -

                                                           Region and District:                                                            -

                                                            Indicator Reviewed:                                                            -

                                                                  Date of Review:                                                          -

                                                   Reporting Period Verified:                                                              -

                    Component of the M&E System

                    Answer Codes: Yes - completely / Partly / No - not at all / N/A

                    REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Recounting reported Results:

Recount results from the periodic reports sent from service sites to the District and
compare to the value reported by the District. Explain discrepancies (if any).

       Re-aggregate the numbers from the reports received from all Service Delivery
1
       Points. What is the re-aggregated number? [A]

       What aggregated result was contained in the summary report prepared by the
2
       District (and submitted to the next reporting level)? [B]


3      Calculate the ratio of recounted to reported numbers. [A/B]                                -


       What are the reasons for the discrepancy (if any) observed (i.e., data entry
4
       errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance:

Review availability, completeness, and timeliness of reports from all Service Delivery
Sites. How many reports should there have been from all Sites? How many are
there? Were they received on time? Are they complete?


5      How many reports should there have been from all service sites? [A]


6      How many reports are there? [B]


7      Calculate % Available Reports [B/A]                                                        -


       Check the dates on the reports received. How many reports were received
8
       on time? (i.e., received by the due date). [C]


9      Calculate % On time Reports [C/A]                                                          -


       How many reports were complete? (i.e., complete means that the report
10
       contained all the required indicator data*). [D]


11     Calculate % Complete Reports [D/A]                                                         -




Part 2: Systems Assessment

    I - M&E Structure, Functions and Capabilities
       There are designated staff responsible for reviewing the quality of data (i.e.,
1      accuracy, completeness and timeliness) received from sub-reporting levels
       (e.g., service points).

       There are designated staff responsible for reviewing aggregated numbers
2
       prior to submission to the next level (e.g., to the central M&E Unit).

       All relevant staff have received training on the data management processes
3
       and tools.




                                                                                District Site 3                                                                 Page 83
II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4        … what they are supposed to report on.


5        … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7        … when the reports are due.


    III- Data-collection and Reporting Forms / Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.

       … The standard forms/tools are consistently used by the Service Delivery
10
       Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

    IV- Data Management Processes

       Feedback is systematically provided to all service points on the quality of their
12
       reporting (i.e., accuracy, completeness and timeliness).

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

       If yes, the latest date of back-up is appropriate given the frequency of update
15
       of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

       There is a written procedure to address late, incomplete, inaccurate and
19     missing reports; including following-up with service points on data quality
       issues.
       If data discrepancies have been uncovered in reports from service points, the
20     Intermediate Aggregation Levels (e.g., districts or regions) have documented
       how these inconsistencies have been resolved.

    V - Links with National Reporting System

       When applicable, the data are reported through a single channel of the
21
       national reporting system.

       When available, the relevant national forms/tools are used for data-collection
22
       and reporting.

       The system records information about where the service is delivered (e.g.,
23
       region, district, ward, etc.).


24     … if yes, place names are recorded using standardized naming conventions.




                                                                                 District Site 3   Page 84
Part 3: Recommendations for the District Site

Based on the findings of the systems review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measures, together with an estimate of the length of time each improvement measure could take (see the systems assessment, function area by function area, in the table below for a review of the system). Action points should be discussed with the Program.


       Identified Weaknesses                                                               Description of Action Point                      Responsible(s)

1



2



3



4




Part 4: DASHBOARD: District Site



[Radar chart: "Data Management Assessment - District Site". One axis per system area (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System); axis ticks 0-10.]

[Bar chart: "Data and Reporting Verifications - District Site". Categories: Verification factor, % Available, % On Time, % Complete; vertical axis 0-1200%.]




                                                                                District Site 3                                                               Page 85
Data Verification and System Assessment Sheet - District Site
                                                   District Site/Organization:                                                             -

                                                           Region and District:                                                            -

                                                            Indicator Reviewed:                                                            -

                                                                  Date of Review:                                                          -

                                                   Reporting Period Verified:                                                              -

                    Component of the M&E System

                    Answer Codes: Yes - completely / Partly / No - not at all / N/A

                    REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Recounting reported Results:

Recount results from the periodic reports sent from service sites to the District and
compare to the value reported by the District. Explain discrepancies (if any).

       Re-aggregate the numbers from the reports received from all Service Delivery
1
       Points. What is the re-aggregated number? [A]

       What aggregated result was contained in the summary report prepared by the
2
       District (and submitted to the next reporting level)? [B]


3      Calculate the ratio of recounted to reported numbers. [A/B]                                -


       What are the reasons for the discrepancy (if any) observed (i.e., data entry
4
       errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance:

Review availability, completeness, and timeliness of reports from all Service Delivery
Sites. How many reports should there have been from all Sites? How many are
there? Were they received on time? Are they complete?


5      How many reports should there have been from all service sites? [A]


6      How many reports are there? [B]


7      Calculate % Available Reports [B/A]                                                        -


       Check the dates on the reports received. How many reports were received
8
       on time? (i.e., received by the due date). [C]


9      Calculate % On time Reports [C/A]                                                          -


       How many reports were complete? (i.e., complete means that the report
10
       contained all the required indicator data*). [D]


11     Calculate % Complete Reports [D/A]                                                         -




Part 2: Systems Assessment

    I - M&E Structure, Functions and Capabilities
       There are designated staff responsible for reviewing the quality of data (i.e.,
1      accuracy, completeness and timeliness) received from sub-reporting levels
       (e.g., service points).

       There are designated staff responsible for reviewing aggregated numbers
2
       prior to submission to the next level (e.g., to the central M&E Unit).

       All relevant staff have received training on the data management processes
3
       and tools.




                                                                                District Site 4                                                                 Page 86
II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4        … what they are supposed to report on.


5        … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7        … when the reports are due.


    III- Data-collection and Reporting Forms / Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

9      The M&E Unit has identified standard reporting forms/tools to be used by all
       reporting levels.

10     … The standard forms/tools are consistently used by the Service Delivery
       Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

    IV- Data Management Processes

       Feedback is systematically provided to all service points on the quality of their
12
       reporting (i.e., accuracy, completeness and timeliness).

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

       If yes, the latest date of back-up is appropriate given the frequency of update
15
       of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

       There is a written procedure to address late, incomplete, inaccurate and
19     missing reports; including following-up with service points on data quality
       issues.
       If data discrepancies have been uncovered in reports from service points, the
20     Intermediate Aggregation Levels (e.g., districts or regions) have documented
       how these inconsistencies have been resolved.

    V - Links with National Reporting System

21     When applicable, the data are reported through a single channel of the
       national reporting system.

22     When available, the relevant national forms/tools are used for data-collection
       and reporting.

23     The system records information about where the service is delivered (e.g.,
       region, district, ward).

24     … if yes, place names are recorded using standardized naming conventions.




                                                                                 District Site 4   Page 87
Part 3: Recommendations for the District Site

Based on the findings of the systems review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measures, with
an estimate of the length of time the improvement measure could take (see the systems assessment by function area in the table below for a review of the system). Action points should be discussed
with the Program.


       Identified Weaknesses                                                               Description of Action Point                      Responsible(s)

1



2



3



4




Part 4: DASHBOARD: District Site



[Radar chart: Data Management Assessment - District Site. Axes: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]

[Bar chart: Data and Reporting Verifications - District Site. Categories: Verification factor, % Available, % On Time, % Complete.]




                                                                                District Site 4                                                               Page 88
Data Verification and System Assessment Sheet - District Site
                                                   District Site/Organization:                                                             -

                                                           Region and District:                                                            -

                                                            Indicator Reviewed:                                                            -

                                                                  Date of Review:                                                          -

                                                   Reporting Period Verified:                                                              -

                                                                                         Answer Codes: Yes - completely / Partly / No - not at all / N/A

                    Component of the M&E System                                          REVIEWER COMMENTS
                                                                                         (Please provide detail for each response not coded "Yes - completely".
                                                                                         Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Recounting reported Results:

Recount results from the periodic reports sent from service sites to the District and
compare to the value reported by the District. Explain discrepancies (if any).

       Re-aggregate the numbers from the reports received from all Service Delivery
1
       Points. What is the re-aggregated number? [A]

       What aggregated result was contained in the summary report prepared by the
2
       District (and submitted to the next reporting level)? [B]


3      Calculate the ratio of recounted to reported numbers. [A/B]                                -


       What are the reasons for the discrepancy (if any) observed (i.e., data entry
4
       errors, arithmetic errors, missing source documents, other)?
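The verification factor computed in item 3 is the recounted total divided by the district-reported total: a value of 1.0 means the two agree exactly, values below 1.0 indicate over-reporting by the district, and values above 1.0 indicate under-reporting. A hedged sketch of the calculation (function name and figures are illustrative, not part of the tool):

```python
def verification_factor(recounted, reported):
    """Ratio of recounted to reported results [A/B].

    recounted -- total re-aggregated from the service-site reports [A]
    reported  -- total in the district's own summary report [B]
    """
    if reported == 0:
        raise ValueError("reported total [B] must be non-zero")
    return recounted / reported

# Illustrative example: site reports re-aggregate to 480,
# but the district summary reported 500.
ratio = verification_factor(480, 500)
print(round(ratio, 2))  # 0.96 -> the district over-reported by 4%
```

Any discrepancy found this way should be traced back to a cause in item 4 (data entry errors, arithmetic errors, missing source documents, or other).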

B - Reporting Performance:

Review availability, completeness, and timeliness of reports from all Service Delivery
Sites. How many reports should there have been from all Sites? How many are
there? Were they received on time? Are they complete?


5      How many reports should there have been from all service sites? [A]


6      How many reports are there? [B]


7      Calculate % Available Reports [B/A]                                                        -


       Check the dates on the reports received. How many reports were received
8
       on time? (i.e., received by the due date). [C]


9      Calculate % On time Reports [C/A]                                                          -


       How many reports were complete? (i.e., complete means that the report
10
       contained all the required indicator data*). [D]


11     Calculate % Complete Reports [D/A]                                                         -




Part 2. Systems Assessment

    I - M&E Structure, Functions and Capabilities
       There are designated staff responsible for reviewing the quality of data (i.e.,
1      accuracy, completeness and timeliness) received from sub-reporting levels
       (e.g., service points).

       There are designated staff responsible for reviewing aggregated numbers
2
       prior to submission to the next level (e.g., to the central M&E Unit).

       All relevant staff have received training on the data management processes
3
       and tools.




                                                                                District Site 5                                                                 Page 89
II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4        … what they are supposed to report on.


5        … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7        … when the reports are due.


    III- Data-collection and Reporting Forms / Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

9      The M&E Unit has identified standard reporting forms/tools to be used by all
       reporting levels.

10     … The standard forms/tools are consistently used by the Service Delivery
       Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

    IV- Data Management Processes

       Feedback is systematically provided to all service points on the quality of their
12
       reporting (i.e., accuracy, completeness and timeliness).

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

       If yes, the latest date of back-up is appropriate given the frequency of update
15
       of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

       There is a written procedure to address late, incomplete, inaccurate and
19     missing reports; including following-up with service points on data quality
       issues.
       If data discrepancies have been uncovered in reports from service points, the
20     Intermediate Aggregation Levels (e.g., districts or regions) have documented
       how these inconsistencies have been resolved.

    V - Links with National Reporting System

21     When applicable, the data are reported through a single channel of the
       national reporting system.

22     When available, the relevant national forms/tools are used for data-collection
       and reporting.

23     The system records information about where the service is delivered (e.g.,
       region, district, ward).

24     … if yes, place names are recorded using standardized naming conventions.




                                                                                 District Site 5   Page 90
Part 3: Recommendations for the District Site

Based on the findings of the systems review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measures, with
an estimate of the length of time the improvement measure could take (see the systems assessment by function area in the table below for a review of the system). Action points should be discussed
with the Program.


       Identified Weaknesses                                                               Description of Action Point                            Responsible(s)

1



2



3



4




Part 4: DASHBOARD: District Site



[Radar chart: Data Management Assessment - District Site. Axes: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]

[Bar chart: Data and Reporting Verifications - District Site. Categories: Verification factor, % Available, % On Time, % Complete.]




                                                                                District Site 5                                                                        Page 91
Data Verification and System Assessment Sheet - District Site
                                                   District Site/Organization:                                                             -

                                                           Region and District:                                                            -

                                                            Indicator Reviewed:                                                            -

                                                                  Date of Review:                                                          -

                                                   Reporting Period Verified:                                                              -

                                                                                         Answer Codes: Yes - completely / Partly / No - not at all / N/A

                    Component of the M&E System                                          REVIEWER COMMENTS
                                                                                         (Please provide detail for each response not coded "Yes - completely".
                                                                                         Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Recounting reported Results:

Recount results from the periodic reports sent from service sites to the District and
compare to the value reported by the District. Explain discrepancies (if any).

       Re-aggregate the numbers from the reports received from all Service Delivery
1
       Points. What is the re-aggregated number? [A]

       What aggregated result was contained in the summary report prepared by the
2
       District (and submitted to the next reporting level)? [B]


3      Calculate the ratio of recounted to reported numbers. [A/B]                                -


       What are the reasons for the discrepancy (if any) observed (i.e., data entry
4
       errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance:

Review availability, completeness, and timeliness of reports from all Service Delivery
Sites. How many reports should there have been from all Sites? How many are
there? Were they received on time? Are they complete?


5      How many reports should there have been from all service sites? [A]


6      How many reports are there? [B]


7      Calculate % Available Reports [B/A]                                                        -


       Check the dates on the reports received. How many reports were received
8
       on time? (i.e., received by the due date). [C]


9      Calculate % On time Reports [C/A]                                                          -


       How many reports were complete? (i.e., complete means that the report
10
       contained all the required indicator data*). [D]


11     Calculate % Complete Reports [D/A]                                                         -




Part 2. Systems Assessment

    I - M&E Structure, Functions and Capabilities
       There are designated staff responsible for reviewing the quality of data (i.e.,
1      accuracy, completeness and timeliness) received from sub-reporting levels
       (e.g., service points).

       There are designated staff responsible for reviewing aggregated numbers
2
       prior to submission to the next level (e.g., to the central M&E Unit).

       All relevant staff have received training on the data management processes
3
       and tools.




                                                                                District Site 6                                                                 Page 92
II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4        … what they are supposed to report on.


5        … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7        … when the reports are due.


    III- Data-collection and Reporting Forms / Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

9      The M&E Unit has identified standard reporting forms/tools to be used by all
       reporting levels.

10     … The standard forms/tools are consistently used by the Service Delivery
       Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

    IV- Data Management Processes

       Feedback is systematically provided to all service points on the quality of their
12
       reporting (i.e., accuracy, completeness and timeliness).

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

       If yes, the latest date of back-up is appropriate given the frequency of update
15
       of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

       There is a written procedure to address late, incomplete, inaccurate and
19     missing reports; including following-up with service points on data quality
       issues.
       If data discrepancies have been uncovered in reports from service points, the
20     Intermediate Aggregation Levels (e.g., districts or regions) have documented
       how these inconsistencies have been resolved.

    V - Links with National Reporting System

21     When applicable, the data are reported through a single channel of the
       national reporting system.

22     When available, the relevant national forms/tools are used for data-collection
       and reporting.

23     The system records information about where the service is delivered (e.g.,
       region, district, ward).

24     … if yes, place names are recorded using standardized naming conventions.




                                                                                 District Site 6   Page 93
Part 3: Recommendations for the District Site

Based on the findings of the systems review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measures, with
an estimate of the length of time the improvement measure could take (see the systems assessment by function area in the table below for a review of the system). Action points should be discussed
with the Program.


       Identified Weaknesses                                                               Description of Action Point                              Responsible(s)

1



2



3



4




Part 4: DASHBOARD: District Site



[Radar chart: Data Management Assessment - District Site. Axes: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]

[Bar chart: Data and Reporting Verifications - District Site. Categories: Verification factor, % Available, % On Time, % Complete.]




                                                                                District Site 6                                                                      Page 94
Data Verification and System Assessment Sheet - District Site
                                                   District Site/Organization:                                                             -

                                                           Region and District:                                                            -

                                                            Indicator Reviewed:                                                            -

                                                                  Date of Review:                                                          -

                                                   Reporting Period Verified:                                                              -

                                                                                         Answer Codes: Yes - completely / Partly / No - not at all / N/A

                    Component of the M&E System                                          REVIEWER COMMENTS
                                                                                         (Please provide detail for each response not coded "Yes - completely".
                                                                                         Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Recounting Reported Results:

Recount results from the periodic reports sent from service sites to the District and
compare to the value reported by the District. Explain discrepancies (if any).

1    Re-aggregate the numbers from the reports received from all Service Delivery
     Points. What is the re-aggregated number? [A]

2    What aggregated result was contained in the summary report prepared by the
     District (and submitted to the next reporting level)? [B]

3    Calculate the ratio of recounted to reported numbers. [A/B]                                        -

4    What are the reasons for the discrepancy (if any) observed (e.g., data entry
     errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance:

Review the availability, completeness, and timeliness of reports from all Service
Delivery Sites. How many reports should there have been from all Sites? How many
are there? Were they received on time? Are they complete?

5    How many reports should there have been from all service sites? [A]

6    How many reports are there? [B]

7    Calculate % Available Reports [B/A]                                                                -

8    Check the dates on the reports received. How many reports were received
     on time (i.e., received by the due date)? [C]

9    Calculate % On-time Reports [C/A]                                                                  -

10   How many reports were complete? (i.e., the report contained all the
     required indicator data*). [D]

11   Calculate % Complete Reports [D/A]                                                                 -
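The Part 1 figures are simple ratios. As a minimal sketch of the arithmetic (not part of the workbook, whose totals are computed by its own macros; the counts below are hypothetical):

```python
def verification_factor(recounted, reported):
    """Ratio of recounted to reported results [A/B].
    1.0 means exact agreement; below 1.0 suggests over-reporting,
    above 1.0 suggests under-reporting."""
    return recounted / reported

def report_rate(observed, expected):
    """Percent of expected reports that were available, on time,
    or complete (e.g., % Available Reports = [B/A] * 100)."""
    return 100.0 * observed / expected

# Hypothetical figures: 95 clients recounted vs. 100 reported;
# 12 reports expected, of which 11 received, 10 on time, 9 complete.
print(verification_factor(95, 100))   # ratio of recounted to reported
print(report_rate(11, 12))            # % Available Reports
print(report_rate(10, 12))            # % On-time Reports
print(report_rate(9, 12))             # % Complete Reports
```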




Part 2: Systems Assessment

    I - M&E Structure, Functions and Capabilities

1    There are designated staff responsible for reviewing the quality of data (i.e.,
     accuracy, completeness and timeliness) received from sub-reporting levels
     (e.g., service points).

2    There are designated staff responsible for reviewing aggregated numbers
     prior to submission to the next level (e.g., to the central M&E Unit).

3    All relevant staff have received training on the data management processes
     and tools.




                                                                                District Site 7                                                                 Page 95
    II - Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …

4    … what they are supposed to report on.

5    … how (e.g., in what specific format) reports are to be submitted.

6    … to whom the reports should be submitted.

7    … when the reports are due.

    III - Data-collection and Reporting Forms / Tools

8    Clear instructions have been provided by the M&E Unit on how to complete
     the data collection and reporting forms/tools.

9    The M&E Unit has identified standard reporting forms/tools to be used by all
     reporting levels.

10   … The standard forms/tools are consistently used by the Service Delivery
     Site.

11   All source documents and reporting forms relevant for measuring the
     indicator(s) are available for auditing purposes (including dated print-outs in
     the case of a computerized system).

    IV - Data Management Processes

12   Feedback is systematically provided to all service points on the quality of their
     reporting (i.e., accuracy, completeness and timeliness).

13   If applicable, there are quality controls in place for when data from paper-
     based forms are entered into a computer (e.g., double entry, post-data-entry
     verification, etc.).

14   If applicable, there is a written back-up procedure for when data entry or data
     processing is computerized.

15   If yes, the latest date of back-up is appropriate given the frequency of update
     of the computerized system (e.g., back-ups are weekly or monthly).

16   Relevant personal data are maintained according to national or international
     confidentiality guidelines.

17   The recording and reporting system avoids double counting people within and
     across Service Delivery Points (e.g., a person receiving the same service
     twice in a reporting period, or a person registered as receiving the same
     service in two different locations).

18   The reporting system enables the identification and recording of a "drop out",
     a person "lost to follow-up" and a person who died.

19   There is a written procedure to address late, incomplete, inaccurate and
     missing reports, including following up with service points on data quality
     issues.

20   If data discrepancies have been uncovered in reports from service points, the
     Intermediate Aggregation Levels (e.g., districts or regions) have documented
     how these inconsistencies have been resolved.

    V - Links with National Reporting System

21   When applicable, the data are reported through a single channel of the
     national reporting system.

22   When available, the relevant national forms/tools are used for data collection
     and reporting.

23   The system records information about where the service is delivered
     (e.g., region, district, ward).

24   … If yes, place names are recorded using standardized naming conventions.




                                                                                 District Site 7   Page 96
Part 3: Recommendations for the District Site

Based on the findings of the systems review and data verification at the District site, please describe any compliance requirements or recommended strengthening measures, with an
estimate of the length of time each improvement measure could take (see the systems assessment by function area in the table below for a review of the system). Action points should be
discussed with the Program.


       Identified Weaknesses                                                               Description of Action Point                             Responsible(s)

1



2



3



4




Part 4: DASHBOARD: District Site



[Radar chart: Data Management Assessment - District Site, plotted across the five system areas:
 I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines;
 III - Data-collection and Reporting Forms / Tools; IV - Data Management Processes;
 V - Links with National Reporting System]

[Bar chart: Data and Reporting Verifications - District Site, showing the Verification factor,
 % Available, % On Time and % Complete]




                                                                                District Site 7                                                                         Page 97
Data Verification and System Assessment Sheet - District Site
                                                   District Site/Organization:                                                             -

                                                           Region and District:                                                            -

                                                            Indicator Reviewed:                                                            -

                                                                  Date of Review:                                                          -

                                                   Reporting Period Verified:                                                              -

                                                                                         Answer Codes: Yes - completely /   REVIEWER COMMENTS
                    Component of the M&E System                                          Partly / No - not at all / N/A     (Please provide detail for each response not coded "Yes - Completely".
                                                                                                                            Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Recounting Reported Results:

Recount results from the periodic reports sent from service sites to the District and
compare to the value reported by the District. Explain discrepancies (if any).

1    Re-aggregate the numbers from the reports received from all Service Delivery
     Points. What is the re-aggregated number? [A]

2    What aggregated result was contained in the summary report prepared by the
     District (and submitted to the next reporting level)? [B]

3    Calculate the ratio of recounted to reported numbers. [A/B]                                        -

4    What are the reasons for the discrepancy (if any) observed (e.g., data entry
     errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance:

Review the availability, completeness, and timeliness of reports from all Service
Delivery Sites. How many reports should there have been from all Sites? How many
are there? Were they received on time? Are they complete?

5    How many reports should there have been from all service sites? [A]

6    How many reports are there? [B]

7    Calculate % Available Reports [B/A]                                                                -

8    Check the dates on the reports received. How many reports were received
     on time (i.e., received by the due date)? [C]

9    Calculate % On-time Reports [C/A]                                                                  -

10   How many reports were complete? (i.e., the report contained all the
     required indicator data*). [D]

11   Calculate % Complete Reports [D/A]                                                                 -




Part 2: Systems Assessment

    I - M&E Structure, Functions and Capabilities

1    There are designated staff responsible for reviewing the quality of data (i.e.,
     accuracy, completeness and timeliness) received from sub-reporting levels
     (e.g., service points).

2    There are designated staff responsible for reviewing aggregated numbers
     prior to submission to the next level (e.g., to the central M&E Unit).

3    All relevant staff have received training on the data management processes
     and tools.




                                                                                District Site 8                                                                 Page 98
    II - Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …

4    … what they are supposed to report on.

5    … how (e.g., in what specific format) reports are to be submitted.

6    … to whom the reports should be submitted.

7    … when the reports are due.

    III - Data-collection and Reporting Forms / Tools

8    Clear instructions have been provided by the M&E Unit on how to complete
     the data collection and reporting forms/tools.

9    The M&E Unit has identified standard reporting forms/tools to be used by all
     reporting levels.

10   … The standard forms/tools are consistently used by the Service Delivery
     Site.

11   All source documents and reporting forms relevant for measuring the
     indicator(s) are available for auditing purposes (including dated print-outs in
     the case of a computerized system).

    IV - Data Management Processes

12   Feedback is systematically provided to all service points on the quality of their
     reporting (i.e., accuracy, completeness and timeliness).

13   If applicable, there are quality controls in place for when data from paper-
     based forms are entered into a computer (e.g., double entry, post-data-entry
     verification, etc.).

14   If applicable, there is a written back-up procedure for when data entry or data
     processing is computerized.

15   If yes, the latest date of back-up is appropriate given the frequency of update
     of the computerized system (e.g., back-ups are weekly or monthly).

16   Relevant personal data are maintained according to national or international
     confidentiality guidelines.

17   The recording and reporting system avoids double counting people within and
     across Service Delivery Points (e.g., a person receiving the same service
     twice in a reporting period, or a person registered as receiving the same
     service in two different locations).

18   The reporting system enables the identification and recording of a "drop out",
     a person "lost to follow-up" and a person who died.

19   There is a written procedure to address late, incomplete, inaccurate and
     missing reports, including following up with service points on data quality
     issues.

20   If data discrepancies have been uncovered in reports from service points, the
     Intermediate Aggregation Levels (e.g., districts or regions) have documented
     how these inconsistencies have been resolved.

    V - Links with National Reporting System

21   When applicable, the data are reported through a single channel of the
     national reporting system.

22   When available, the relevant national forms/tools are used for data collection
     and reporting.

23   The system records information about where the service is delivered
     (e.g., region, district, ward).

24   … If yes, place names are recorded using standardized naming conventions.




                                                                                 District Site 8   Page 99
Part 3: Recommendations for the District Site


Based on the findings of the systems review and data verification at the District site, please describe any compliance requirements or recommended strengthening measures, with an
estimate of the length of time each improvement measure could take (see the systems assessment by function area in the table below for a review of the system). Action points should be
discussed with the Program.


       Identified Weaknesses                                                               Description of Action Point                      Responsible(s)

1



2



3



4




Part 4: DASHBOARD: District Site



[Radar chart: Data Management Assessment - District Site, plotted across the five system areas:
 I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines;
 III - Data-collection and Reporting Forms / Tools; IV - Data Management Processes;
 V - Links with National Reporting System]

[Bar chart: Data and Reporting Verifications - District Site, showing the Verification factor,
 % Available, % On Time and % Complete]




                                                                                District Site 8                                                               Page 100
District Site Summary Statistics

[Radar chart: Data Management Assessment - District Level Summary, plotted across the five system areas:
 M&E Structure, Functions and Capabilities; Indicator Definitions and Reporting Guidelines;
 Data-collection and Reporting Forms / Tools; Data Management Processes;
 Links with National Reporting System]

[Bar chart: Data and Reporting Verifications - District Level Summary, showing % Available,
 % On Time, % Complete and the Verification Factor]




                                                                                District Summary                                                                         Page 101
Data Verification and System Assessment Sheet - Regional Site
                                                Regional Site/Organization:                                                                -

                                                                            Region:                                                        -

                                                           Indicator Reviewed:                                                             -

                                                                 Date of Review:                                                           -

                                                  Reporting Period Verified:                                                               -

                                                                                         Answer Codes: Yes - completely /   REVIEWER COMMENTS
                    Component of the M&E System                                          Partly / No - not at all / N/A     (Please provide detail for each response not coded "Yes - Completely".
                                                                                                                            Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Recounting Reported Results:

Recount results from the periodic reports sent from the Districts to the Region and
compare to the value reported by the Region. Explain discrepancies (if any).

1    Re-aggregate the numbers from the reports received from all Districts.
     What is the re-aggregated number? [A]

2    What aggregated result was contained in the summary report prepared by the
     Intermediate Aggregation Site (and submitted to the next reporting level)? [B]

3    Calculate the ratio of recounted to reported numbers. [A/B]                                        -

4    What are the reasons for the discrepancy (if any) observed (e.g., data entry
     errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance:

Review the availability, completeness, and timeliness of reports from all Districts within
the Region. How many reports should there have been from all Districts? How many
are there? Were they received on time? Are they complete?

5    How many reports should there have been from all Districts? [A]

6    How many reports are there? [B]

7    Calculate % Available Reports [B/A]                                                                -

8    Check the dates on the reports received. How many reports were received
     on time (i.e., received by the due date)? [C]

9    Calculate % On-time Reports [C/A]                                                                  -

10   How many reports were complete? (i.e., the report contained all the
     required indicator data*). [D]

11   Calculate % Complete Reports [D/A]                                                                 -




                                                                               Regional Site 1                                                                 Page 102
Part 2: Systems Assessment

    I - M&E Structure, Functions and Capabilities

1    There are designated staff responsible for reviewing the quality of data (i.e.,
     accuracy, completeness and timeliness) received from sub-reporting levels
     (e.g., service points).

2    There are designated staff responsible for reviewing aggregated numbers
     prior to submission to the next level (e.g., to the central M&E Unit).

3    All relevant staff have received training on the data management processes
     and tools.

    II - Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …

4    … what they are supposed to report on.

5    … how (e.g., in what specific format) reports are to be submitted.

6    … to whom the reports should be submitted.

7    … when the reports are due.

    III - Data-collection and Reporting Forms / Tools

8    Clear instructions have been provided by the M&E Unit on how to complete
     the data collection and reporting forms/tools.

9    The M&E Unit has identified standard reporting forms/tools to be used by all
     reporting levels.

10   … The standard forms/tools are consistently used by the Service Delivery
     Site.

11   All source documents and reporting forms relevant for measuring the
     indicator(s) are available for auditing purposes (including dated print-outs in
     the case of a computerized system).

    IV - Data Management Processes

12   Feedback is systematically provided to all service points on the quality of their
     reporting (i.e., accuracy, completeness and timeliness).

13   If applicable, there are quality controls in place for when data from paper-
     based forms are entered into a computer (e.g., double entry, post-data-entry
     verification, etc.).

14   If applicable, there is a written back-up procedure for when data entry or data
     processing is computerized.

15   If yes, the latest date of back-up is appropriate given the frequency of update
     of the computerized system (e.g., back-ups are weekly or monthly).

16   Relevant personal data are maintained according to national or international
     confidentiality guidelines.

17   The recording and reporting system avoids double counting people within and
     across Service Delivery Points (e.g., a person receiving the same service
     twice in a reporting period, or a person registered as receiving the same
     service in two different locations).

18   The reporting system enables the identification and recording of a "drop out",
     a person "lost to follow-up" and a person who died.

19   There is a written procedure to address late, incomplete, inaccurate and
     missing reports, including following up with service points on data quality
     issues.

20   If data discrepancies have been uncovered in reports from service points, the
     Intermediate Aggregation Levels (e.g., districts or regions) have documented
     how these inconsistencies have been resolved.




                                                                                Regional Site 1   Page 103
    V - Links with National Reporting System

21   When applicable, the data are reported through a single channel of the
     national reporting system.

22   When available, the relevant national forms/tools are used for data collection
     and reporting.

23   The system records information about where the service is delivered
     (e.g., region, district, ward).

24   … If yes, place names are recorded using standardized naming conventions.




Part 3: Recommendations for the Intermediate Aggregation Level

Based on the findings of the systems review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measures,
with an estimate of the length of time each improvement measure could take (see the systems assessment by function area in the table below for a review of the system). Action points should
be discussed with the Program.


        Identified Weaknesses                                                              Description of Action Point                      Responsible(s)

1



2



3



4




Part 4: DASHBOARD: Intermediate Aggregation Level



[Radar chart: Data Management Assessment - Regional Site, plotted across the five system areas:
 I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines;
 III - Data-collection and Reporting Forms / Tools; IV - Data Management Processes;
 V - Links with National Reporting System]

[Bar chart: Data and Reporting Verifications - Regional Site, showing the Verification factor,
 % Available, % On Time and % Complete]




                                                                               Regional Site 1                                                               Page 104
Data Verification and System Assessment Sheet - Regional Site
                                                Regional Site/Organization:                                                                -

                                                                            Region:                                                        -

                                                           Indicator Reviewed:                                                             -

                                                                 Date of Review:                                                           -

                                                  Reporting Period Verified:                                                               -

                    Component of the M&E System
                    Answer Codes: Yes - completely / Partly / No - not at all / N/A
                    REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Recounting reported Results:

Recount results from the periodic reports sent from the Districts to the Region and
compare to the value reported by the Region. Explain discrepancies (if any).


      Re-aggregate the numbers from the reports received from all Service Delivery
1
      Points. What is the re-aggregated number? [A]

      What aggregated result was contained in the summary report prepared by the
2
      Intermediate Aggregation Site (and submitted to the next reporting level)? [B]


3     Calculate the ratio of recounted to reported numbers. [A/B]                                 -


      What are the reasons for the discrepancy (if any) observed (i.e., data entry
4
      errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance:

Review availability, completeness, and timeliness of reports from all Districts within
the Region. How many reports should there have been from all Districts? How
many are there? Were they received on time? Are they complete?


5     How many reports should there have been from all Districts? [A]


6     How many reports are there? [B]


7     Calculate % Available Reports [B/A]                                                         -


      Check the dates on the reports received. How many reports were received
8
      on time? (i.e., received by the due date). [C]


9     Calculate % On time Reports [C/A]                                                           -


      How many reports were complete? (i.e., complete means that the report
10
      contained all the required indicator data*). [D]


11    Calculate % Complete Reports [D/A]                                                          -
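The spreadsheet computes the four figures in items 3, 7, 9 and 11 automatically; as a hedged sketch (plain Python, with hypothetical example values rather than the tool's own data), each calculation reduces to a simple quotient:

```python
def ratio(numerator, denominator):
    """Return numerator/denominator, or None when the denominator is zero
    (the spreadsheet displays '-' until valid inputs are entered)."""
    return None if denominator == 0 else numerator / denominator

# Hypothetical example values for one regional site:
recounted = 105         # [A] re-aggregated from district reports
reported = 100          # [B] figure in the region's summary report
expected_reports = 12   # [A] reports due from all districts
received = 11           # [B] reports actually on file
on_time = 9             # [C] received by the due date
complete = 10           # [D] containing all required indicator data

verification_factor = ratio(recounted, reported)    # 1.05, i.e. 5% over-count
pct_available = ratio(received, expected_reports)   # ~0.92
pct_on_time = ratio(on_time, expected_reports)      # 0.75
pct_complete = ratio(complete, expected_reports)    # ~0.83
```

A verification factor above 1 indicates the recount exceeds the reported figure; below 1 indicates over-reporting by the site.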




                                                                               Regional Site 2                                                                 Page 105
Part 2: Systems Assessment

    I - M&E Structure, Functions and Capabilities
       There are designated staff responsible for reviewing the quality of data (i.e.,
1      accuracy, completeness and timeliness) received from sub-reporting levels
       (e.g., service points).

       There are designated staff responsible for reviewing aggregated numbers
2
       prior to submission to the next level (e.g., to the central M&E Unit).

       All relevant staff have received training on the data management processes
3
       and tools.

    II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4        … what they are supposed to report on.


5        … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7        … when the reports are due.


    III- Data-collection and Reporting Forms / Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.

       … The standard forms/tools are consistently used by the Service Delivery
10
       Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of a computerized system).

    IV- Data Management Processes

       Feedback is systematically provided to all service points on the quality of their
12
       reporting (i.e., accuracy, completeness and timeliness).

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

       If yes, the latest date of back-up is appropriate given the frequency of update
15
       of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

       There is a written procedure to address late, incomplete, inaccurate and
19     missing reports; including following-up with service points on data quality
       issues.
       If data discrepancies have been uncovered in reports from service points, the
20     Intermediate Aggregation Levels (e.g., districts or regions) have documented
       how these inconsistencies have been resolved.




                                                                                Regional Site 2   Page 106
When available, the relevant national forms/tools are used for data-collection
21
        and reporting.

        The system records information about where the service is delivered (e.g.,
22
        region, district, ward, etc.)


23        … if yes, place names are recorded using standardized naming conventions.




Part 3: Recommendations for the Intermediate Aggregation Level

Based on the findings of the systems review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measures, with
an estimate of the length of time each improvement measure could take (see the systems assessment by function area in the table below for a review of the system). Action points should be discussed
with the Program.


        Identified Weaknesses                                                              Description of Action Point                     Responsible(s)

1



2



3



4




Part 4: DASHBOARD: Intermediate Aggregation Level



[Charts: a radar chart, "Data Management Assessment - Regional Site", plots the five function areas (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System) on a 0-3 scale; a bar chart, "Data and Reporting Verifications - Regional Site", shows the Verification factor, % Available, % On Time and % Complete.]




                                                                               Regional Site 2                                                              Page 107
Data Verification and System Assessment Sheet - Regional Site
                                                Regional Site/Organization:                                                                -

                                                                            Region:                                                        -

                                                           Indicator Reviewed:                                                             -

                                                                 Date of Review:                                                           -

                                                  Reporting Period Verified:                                                               -

                    Component of the M&E System
                    Answer Codes: Yes - completely / Partly / No - not at all / N/A
                    REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Recounting reported Results:

Recount results from the periodic reports sent from the Districts to the Region and
compare to the value reported by the Region. Explain discrepancies (if any).


      Re-aggregate the numbers from the reports received from all Service Delivery
1
      Points. What is the re-aggregated number? [A]

      What aggregated result was contained in the summary report prepared by the
2
      Intermediate Aggregation Site (and submitted to the next reporting level)? [B]


3     Calculate the ratio of recounted to reported numbers. [A/B]                                 -


      What are the reasons for the discrepancy (if any) observed (i.e., data entry
4
      errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance:

Review availability, completeness, and timeliness of reports from all Districts within
the Region. How many reports should there have been from all Districts? How
many are there? Were they received on time? Are they complete?


5     How many reports should there have been from all Districts? [A]


6     How many reports are there? [B]


7     Calculate % Available Reports [B/A]                                                         -


      Check the dates on the reports received. How many reports were received
8
      on time? (i.e., received by the due date). [C]


9     Calculate % On time Reports [C/A]                                                           -


      How many reports were complete? (i.e., complete means that the report
10
      contained all the required indicator data*). [D]


11    Calculate % Complete Reports [D/A]                                                          -




                                                                               Regional Site 3                                                                 Page 108
Part 2: Systems Assessment

    I - M&E Structure, Functions and Capabilities
       There are designated staff responsible for reviewing the quality of data (i.e.,
1      accuracy, completeness and timeliness) received from sub-reporting levels
       (e.g., service points).

       There are designated staff responsible for reviewing aggregated numbers
2
       prior to submission to the next level (e.g., to the central M&E Unit).

       All relevant staff have received training on the data management processes
3
       and tools.

    II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4        … what they are supposed to report on.


5        … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7        … when the reports are due.


    III- Data-collection and Reporting Forms / Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.

       … The standard forms/tools are consistently used by the Service Delivery
10
       Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of a computerized system).

    IV- Data Management Processes

       Feedback is systematically provided to all service points on the quality of their
12
       reporting (i.e., accuracy, completeness and timeliness).

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

       If yes, the latest date of back-up is appropriate given the frequency of update
15
       of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

       There is a written procedure to address late, incomplete, inaccurate and
19     missing reports; including following-up with service points on data quality
       issues.
       If data discrepancies have been uncovered in reports from service points, the
20     Intermediate Aggregation Levels (e.g., districts or regions) have documented
       how these inconsistencies have been resolved.




                                                                                Regional Site 3   Page 109
When available, the relevant national forms/tools are used for data-collection
21
        and reporting.

        The system records information about where the service is delivered (e.g.,
22
        region, district, ward, etc.)


23        … if yes, place names are recorded using standardized naming conventions.




Part 3: Recommendations for the Intermediate Aggregation Level

Based on the findings of the systems review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measures, with
an estimate of the length of time each improvement measure could take (see the systems assessment by function area in the table below for a review of the system). Action points should be discussed
with the Program.


        Identified Weaknesses                                                              Description of Action Point                             Responsible(s)

1



2



3



4




Part 4: DASHBOARD: Intermediate Aggregation Level



[Charts: a radar chart, "Data Management Assessment - Regional Site", plots the five function areas (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System) on a 0-3 scale; a bar chart, "Data and Reporting Verifications - Regional Site", shows the Verification factor, % Available, % On Time and % Complete.]




                                                                                Regional Site 3                                                                     Page 110
Data Verification and System Assessment Sheet - Regional Site
                                                Regional Site/Organization:                                                                -

                                                                            Region:                                                        -

                                                           Indicator Reviewed:                                                             -

                                                                 Date of Review:                                                           -

                                                  Reporting Period Verified:                                                               -

                    Component of the M&E System
                    Answer Codes: Yes - completely / Partly / No - not at all / N/A
                    REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Recounting reported Results:

Recount results from the periodic reports sent from the Districts to the Region and
compare to the value reported by the Region. Explain discrepancies (if any).


      Re-aggregate the numbers from the reports received from all Service Delivery
1
      Points. What is the re-aggregated number? [A]

      What aggregated result was contained in the summary report prepared by the
2
      Intermediate Aggregation Site (and submitted to the next reporting level)? [B]


3     Calculate the ratio of recounted to reported numbers. [A/B]                                 -


      What are the reasons for the discrepancy (if any) observed (i.e., data entry
4
      errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance:

Review availability, completeness, and timeliness of reports from all Districts within
the Region. How many reports should there have been from all Districts? How
many are there? Were they received on time? Are they complete?


5     How many reports should there have been from all Districts? [A]


6     How many reports are there? [B]


7     Calculate % Available Reports [B/A]                                                         -


      Check the dates on the reports received. How many reports were received
8
      on time? (i.e., received by the due date). [C]


9     Calculate % On time Reports [C/A]                                                           -


      How many reports were complete? (i.e., complete means that the report
10
      contained all the required indicator data*). [D]


11    Calculate % Complete Reports [D/A]                                                          -




                                                                               Regional Site 4                                                                 Page 111
Part 2: Systems Assessment

    I - M&E Structure, Functions and Capabilities
       There are designated staff responsible for reviewing the quality of data (i.e.,
1      accuracy, completeness and timeliness) received from sub-reporting levels
       (e.g., service points).

       There are designated staff responsible for reviewing aggregated numbers
2
       prior to submission to the next level (e.g., to the central M&E Unit).

       All relevant staff have received training on the data management processes
3
       and tools.

    II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4        … what they are supposed to report on.


5        … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7        … when the reports are due.


    III- Data-collection and Reporting Forms / Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.

       … The standard forms/tools are consistently used by the Service Delivery
10
       Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of a computerized system).

    IV- Data Management Processes

       Feedback is systematically provided to all service points on the quality of their
12
       reporting (i.e., accuracy, completeness and timeliness).

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

       If yes, the latest date of back-up is appropriate given the frequency of update
15
       of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

       There is a written procedure to address late, incomplete, inaccurate and
19     missing reports; including following-up with service points on data quality
       issues.
       If data discrepancies have been uncovered in reports from service points, the
20     Intermediate Aggregation Levels (e.g., districts or regions) have documented
       how these inconsistencies have been resolved.




                                                                                Regional Site 4   Page 112
When available, the relevant national forms/tools are used for data-collection
21
        and reporting.

        The system records information about where the service is delivered (e.g.,
22
        region, district, ward, etc.)


23        … if yes, place names are recorded using standardized naming conventions.




Part 3: Recommendations for the Intermediate Aggregation Level

Based on the findings of the systems review and data verification at the Regional site, please describe any compliance requirements or recommended strengthening measures, with an estimate of
the length of time each improvement measure could take (see the systems assessment by function area in the table below for a review of the system). Action points should be discussed with the
Program.


        Identified Weaknesses                                                              Description of Action Point                             Responsible(s)

1



2



3



4




Part 4: DASHBOARD: Intermediate Aggregation Level



[Charts: a radar chart, "Data Management Assessment - Regional Site", plots the five function areas (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System) on a 0-3 scale; a bar chart, "Data and Reporting Verifications - Regional Site", shows the Verification factor, % Available, % On Time and % Complete.]




                                                                                Regional Site 4                                                                     Page 113
Regional Site Summary Statistics
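The regional-level summary charts aggregate the per-site results above. How the tool combines sites (simple versus weighted averages) is not shown in this extract, so the following is an illustrative sketch assuming a simple mean, with made-up site values:

```python
# Hypothetical per-site results (these names and values are examples,
# not figures from the RDQA tool itself).
sites = {
    "Regional Site 1": {"verification": 1.05, "available": 0.92, "on_time": 0.75, "complete": 0.83},
    "Regional Site 2": {"verification": 0.98, "available": 1.00, "on_time": 0.90, "complete": 0.95},
}

def regional_summary(site_results):
    """Average each metric across sites, skipping sites with no value recorded."""
    metrics = {}
    for metric in ("verification", "available", "on_time", "complete"):
        values = [r[metric] for r in site_results.values() if r.get(metric) is not None]
        metrics[metric] = sum(values) / len(values) if values else None
    return metrics

summary = regional_summary(sites)
# e.g. summary["on_time"] is the mean of 0.75 and 0.90, i.e. 0.825
```

If the sites handle very different report volumes, a volume-weighted average would be a reasonable alternative; the simple mean here mirrors the equal visual weight each site gets in the summary charts.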

[Charts: a radar chart, "Data Management Assessment - Regional Level Summary", plots the five function areas (M&E Structure, Functions and Capabilities; Indicator Definitions and Reporting Guidelines; Data-collection and Reporting Forms / Tools; Data Management Processes; Links with National Reporting System) on a 0-3 scale; a bar chart, "Data and Reporting Verifications - Regional Level Summary", shows % Available, % On Time, % Complete and the Verification Factor.]




                                                                                   Regional Summary                                                                        Page 114
Data Verification and System Assessment Sheet - National Level M&E Unit

                                  National Level M&E Unit/Organization:                                                                           -

                                                            Indicator Reviewed:                                                                   -

                                                                  Date of Review:                                                                 -

                                                    Reporting Period Verified:                                                                    -

                     Component of the M&E System                Answer Codes: Yes - completely / Partly / No - not at all / N/A                REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Recounting Reported Results:

Recount results from the periodic reports sent from the intermediate aggregation
sites to the National Level and compare to the value published by the National
Program (or reported by the National Program to the Donor, if applicable). Explain
discrepancies (if any).

1       Re-aggregate the numbers from the reports received from all reporting entities. What is the re-aggregated number? [A]

2       What aggregated result was contained in the summary report prepared by the M&E Unit? [B]

3       Calculate the ratio of recounted to reported numbers. [A/B]                                       -

4       What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
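The recount check above reduces to a single ratio. As an illustration only (the figures and the function name are hypothetical, not part of the RDQA tool), the verification factor [A/B] can be computed as:

```python
# Hypothetical helper for the Part 1A calculation. [A] is the recounted
# total re-aggregated from sub-level reports; [B] is the total the M&E
# Unit actually reported.
def verification_factor(recounted: int, reported: int) -> float:
    """Ratio of recounted to reported results: 1.0 is a perfect match,
    > 1.0 suggests under-reporting, < 1.0 suggests over-reporting."""
    if reported == 0:
        raise ValueError("reported total [B] must be non-zero")
    return recounted / reported

# e.g. 95 recounted vs. 100 reported -> 0.95
print(verification_factor(95, 100))
```

Any discrepancy from 1.0 is then explained under question 4 (data entry errors, arithmetic errors, missing source documents, other).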

B - Reporting Performance:

Review availability, completeness, and timeliness of reports from all Intermediate
Aggregation Sites. How many reports should there have been from all Aggregation
Sites? How many are there? Were they received on time? Are they complete?

5       How many reports should there have been from all reporting entities (e.g., regions, districts, service points)? [A]

6       How many reports are there? [B]

7       Calculate % Available Reports [B/A]                                                       -

8       Check the dates on the reports received. How many reports were received on time (i.e., received by the due date)? [C]

9       Calculate % On Time Reports [C/A]                                                         -

10      How many reports were complete (i.e., complete means that the report contained all the required indicator data*)? [D]

11      Calculate % Complete Reports [D/A]                                                        -




Part 2. Systems Assessment

    I - M&E Structure, Functions and Capabilities

1       There is a documented organizational structure/chart that clearly identifies positions that have data management responsibilities at the M&E Unit (specify which Unit: e.g. MoH, NAP, GF, World Bank).

2       All staff positions dedicated to M&E and data management systems are filled.

3       A senior staff member (e.g., the Program Manager) is responsible for reviewing the aggregated numbers prior to the submission/release of reports from the M&E Unit.

4       There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness, timeliness and confidentiality) received from sub-reporting levels (e.g., regions, districts, service points).

5       There is a training plan which includes staff involved in data-collection and reporting at all levels in the reporting process.

6       All relevant staff have received training on the data management processes and tools.




                                                                                          National Level - M&E Unit                                                                            Page 115
    II - Indicator Definitions and Reporting Guidelines

7       The M&E Unit has documented and shared the definition of the indicator(s) with all relevant levels of the reporting system (e.g., regions, districts, service points).

8       There is a description of the services that are related to each indicator measured by the Program/project.

9       There is a written policy that states for how long source documents and reporting forms need to be retained.

10      The M&E Unit has provided written guidelines to all reporting entities (e.g., regions, districts, service points) on reporting requirements and deadlines.

    The M&E Unit has provided written guidelines to each sub-reporting level on …

11      … what they are supposed to report on.

12      … how (e.g., in what specific format) reports are to be submitted.

13      … to whom the reports should be submitted.

14      … when the reports are due.


    III - Data-collection and Reporting Forms / Tools

15      If multiple organizations are implementing activities under the Program/project, they all use the same reporting forms and report according to the same reporting timelines.

16      The M&E Unit has identified a standard source document (e.g., medical record, client intake form, register, etc.) to be used by all service delivery points to record service delivery.

17      The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels.

18      … The standard forms/tools are consistently used by the Service Delivery Site.

19      Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.

20      The data collected by the M&E system has sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).

21      All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of a computerized system).

    IV - Data Management Processes

22      The M&E Unit has clearly documented data aggregation, analysis and/or manipulation steps performed at each level of the reporting system.

23      Feedback is systematically provided to all sub-reporting levels on the quality of their reporting (i.e., accuracy, completeness and timeliness).

24      (If applicable) There are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data entry verification, etc.).

25      (If applicable) There is a written back-up procedure for when data entry or data processing is computerized.

26      … If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).

27      Relevant personal data are maintained according to national or international confidentiality guidelines.

28      The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).

29      The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.

30      There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with sub-reporting levels on data quality issues.

31      If data discrepancies have been uncovered in reports from sub-reporting levels (e.g., districts or regions), the M&E Unit has documented how these inconsistencies have been resolved.

32      The M&E Unit can demonstrate that regular supervisory site visits have taken place and that data quality has been reviewed.




                                                                                              National Level - M&E Unit   Page 116
    V - Links with National Reporting System

33      When applicable, the data are reported through a single channel of the national reporting system.

34      When available, the relevant national forms/tools are used for data-collection and reporting.

35      Reporting deadlines are harmonized with the relevant timelines of the National Program (e.g., cut-off dates for monthly reporting).

36      The service sites are identified using ID numbers that follow a national system.

37      The system records information about where the service is delivered (i.e., region, district, ward, etc.).

38      … If yes, place names are recorded using standardized naming conventions.




Part 3: Follow up Recommendations and Action Plan - M&E Unit

        Summarize key issues that the Program should follow up at various levels of the system (e.g. issues found at site level and/or at intermediate aggregation site level).


        Identified Weaknesses                                                                Description of Action Point                         Responsible(s)               Time Line

1


2


3


4




Part 4: DASHBOARD: National Level - M&E Unit


[Charts: "Data Management Assessment - M&E Unit" - radar chart (scale 0-3) across the five functional areas: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System. "Data and Reporting Verifications - M&E Unit" - bar chart (scale 0-1200%) of % Available, % On Time, % Complete and Verification Factor.]




                                                                                           National Level - M&E Unit                                                                             Page 117
SUMMARY TABLE - Assessment of Data Management and Reporting Systems

Functional areas: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms / Tools; IV - Data Management Processes; V - Links with National Reporting System

                                                    I       II      III     IV      V       Average (per site)

M&E Unit                                            N/A     N/A     N/A     N/A     N/A     N/A

Regional Level (1 site)                             N/A     N/A     N/A     N/A     N/A     N/A

Intermediate Aggregation Level Sites (1 site)       N/A     N/A     N/A     N/A     N/A     N/A

Service Delivery Points/Organizations (1 site)      N/A     N/A     N/A     N/A     N/A     N/A

Average (per functional area)                       N/A     N/A     N/A     N/A     N/A




                                                            System Assessment Summary                                                                         Page 118
Global Dashboard - Summary Statistics, All Levels


[Charts: "Data Management Assessment - Global Aggregate Score" - radar chart (scale 0-3) across the five functional areas: M&E Structure, Functions and Capabilities; Indicator Definitions and Reporting Guidelines; Data-collection and Reporting Forms / Tools; Data Management Processes; Links with National Reporting System. "Data and Reporting Verifications - Global Aggregate Score" - bar chart (scale 0-1200%) of % Available, % On Time, % Complete and Verification Factor.]




                                                                                Global Dashboard                                                                           Page 119
RDQA Final Action Plan
 Country:


 Program/project


 Date of RDQA:


 Date of Proposed Follow-up


 Description of Weakness                                    System Strengthening Measures                  Responsible(s)   Timeline    Comments




Add rows as needed




                                                          Summary of Site Specific Action Plans
 Site                             Identified Weaknesses     System Strengthening Measures                  Responsible(s)   Timeline    Comments

 National Level - M&E Unit    1   -                         -                                              -                -

                              2   -                         -                                              -                -
 -                            3   -                         -                                              -                -
                              4   -                         -                                              -                -

 Regional Site 1              1   -                         -                                              -                -
                              2   -                         -                                              -                -

 -                            3   -                         -                                              -                -
                              4   -                         -                                              -                -

 District Site 1              1   -                         -                                              -                -
                              2   -                         -                                              -                -
 -                            3   -                         -                                              -                -

                              4   -                         -                                              -                -
 Service Point 1              1   -                         -                                              -                -

                              2   -                         -                                              -                -
 -                            3   -                         -                                              -                -
                              4   -                         -                                              -                -




                                                                                  RDQA Final Action Plan                                           Page 120
Systems Assessment Components Contributing to Data Quality Dimensions

Functional Area                                 Level: M&E Unit / Aggregation Levels / Service Points                Dimension of Data Quality: Accuracy / Reliability / Timeliness / Completeness / Precision / Confidentiality / Integrity
I - M&E Structure, Functions and Capabilities

There is a documented organizational structure/chart that clearly identifies
positions that have data management responsibilities at the M&E Unit. (to             P                                                  —             —             —
specify which Unit: e.g. MoH, NAP, GF, World Bank)

All staff positions dedicated to M&E and data management systems are
                                                                                      P                                                  —             —             —
filled.

A senior staff member (e.g., the Program Manager) is responsible for
reviewing the aggregated numbers prior to the submission/release of reports           P                                                  —             —                          —              —
from the M&E Unit.

There are designated staff responsible for reviewing the quality of data (i.e.,
accuracy, completeness, timeliness and confidentiality ) received from sub-           P             P                                    —             —             —            —              —               —
reporting levels (e.g., regions, districts, service points).

There are designated staff responsible for reviewing aggregated numbers
                                                                                                    P                   P                —             —
prior to submission to the next level (e.g., to the central M&E Unit).

The responsibility for recording the delivery of services on source documents
                                                                                                                        P                —             —
is clearly assigned to the relevant staff.

There is a training plan which includes staff involved in data-collection and
                                                                                      P                                                  —             —             —            —                              —
reporting at all levels in the reporting process.

All relevant staff have received training on the data management processes
                                                                                      P             P                   P                —             —             —            —              —               —
and tools.

II- Indicator Definitions and Reporting Guidelines

The M&E Unit has documented and shared the definition of the indicator(s)
with all relevant levels of the reporting system (e.g., regions, districts, service   P                                                  —             —
points).

There is a description of the services that are related to each indicator
                                                                                      P                                                  —             —
measured by the Program/project.

The M&E Unit has provided written guidelines to all reporting entities (e.g.,
                                                                                      P             P                   P                —             —             —            —
regions, districts, service points) on reporting requirements and deadlines.

There is a written policy that states for how long source documents and
                                                                                      P                                                  —             —             —            —              —                               —
reporting forms need to be retained.

III- Data-collection and Reporting Forms / Tools

If multiple organizations are implementing activities under the
Program/project, they all use the same reporting forms and report according           P                                                  —             —
to the same reporting timelines.




                                                           List of Survey Questions                                                                                                                        Page 121
The M&E Unit has identified a standard source document (e.g., medical
record, client intake form, register, etc.) to be used by all service delivery    P                                                  —             —
points to record service delivery.

The M&E Unit has identified standard reporting forms/tools to be used by all
                                                                                  P             P                   P                —             —
reporting levels


… The standard forms/tools are consistently used by all levels.                   P             P                   P                —             —


Clear instructions have been provided by the M&E Unit on how to complete
                                                                                  P             P                   P                —             —
the data collection and reporting forms/tools.

The data collected by the M&E system has sufficient precision to measure
the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the      P                                 P                                                                        —
indicator specifies disaggregation by these characteristics).

All source documents and reporting forms relevant for measuring the
indicator(s) are available for auditing purposes (including dated print-outs in   P             P                   P                —             —             —            —              —                               —
case of computerized system).

IV- Data Management Processes

The M&E Unit has clearly documented data aggregation, analysis and/or
                                                                                  P                                                  —             —             —            —              —
manipulation steps performed at each level of the reporting system.

Feedback is systematically provided to all sub-reporting levels on the quality
                                                                                  P             P                                    —             —             —            —              —
of their reporting (i.e., accuracy, completeness and timeliness).

[If applicable] There are quality controls in place for when data from paper-
based forms are entered into a computer (e.g., double entry, post-data entry      P             P                   P                —             —             —            —              —                               —
verification, etc).

[If applicable] There is a written back-up procedure for when data entry or
                                                                                  P             P                   P                —             —             —            —              —                               —
data processing is computerized.


If yes, the latest date of back-up is appropriate given the frequency of update
                                                                                P               P                   P                —             —             —            —              —                               —
of the computerized system (e.g., back-ups are weekly or monthly).


Relevant personal data are maintained according to national or international
                                                                                  P             P                   P                                                                                        —
confidentiality guidelines.

The recording and reporting system avoids double counting people within
and across Service Delivery Points (e.g., a person receiving the same
                                                                               P                P                   P                —             —
service twice in a reporting period, a person registered as receiving the same
service in two different locations, etc).

The reporting system enables the identification and recording of a "drop out",
                                                                               P                P                   P                —             —
a person "lost to follow-up" and a person who died.




                                                           List of Survey Questions                                                                                                                    Page 122
Systems Assessment Components Contributing to Data Quality Dimensions

Functional Area | Level (M&E Unit, Aggregation Levels, Service Points) | Dimension of Data Quality (Accuracy, Reliability, Timeliness, Completeness, Precision, Confidentiality, Integrity)
There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with sub-reporting levels on data quality issues.
    P  P  —  —  —  —  —  —

If data discrepancies have been uncovered in reports from sub-reporting levels, the M&E Unit (e.g., districts or regions) has documented how these inconsistencies have been resolved.
    P  P  —  —  —  —  —  —

The M&E Unit can demonstrate that regular supervisory site visits have taken place and that data quality has been reviewed.
    P  —  —  —  —  —  —  —

  V - Links with National Reporting System

When available, the relevant national forms/tools are used for data-collection and reporting.
    P  P  P  —  —  —  —

When applicable, the data are reported through a single channel of the national reporting system.
    P  P  P  —  —  —  —

Reporting deadlines are harmonized with the relevant timelines of the National Program (e.g., cut-off dates for monthly reporting).
    P  P  —  —  —  —

The service sites are identified using ID numbers that follow a national system.
    P  P  —  —  —  —
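A national facility-coding scheme lends itself to a simple format check at data entry. The region-district-facility pattern below is invented for illustration; a real assessment would validate against the country's actual coding standard.

```python
import re

# Hypothetical national format: two-letter region, two-digit district,
# four-digit facility number (e.g. "NW-03-0142"). Not a real scheme.
NATIONAL_ID_PATTERN = re.compile(r"^[A-Z]{2}-\d{2}-\d{4}$")

def id_follows_national_system(facility_code: str) -> bool:
    """Return True if the facility code matches the assumed national format."""
    return bool(NATIONAL_ID_PATTERN.match(facility_code))

print(id_follows_national_system("NW-03-0142"))  # True
print(id_follows_national_system("clinic7"))     # False
```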

The system records information about where the service is delivered (e.g., region, district, ward, etc.).
    P  P  P  —  —  —  —


If yes, place names are recorded using standardized naming conventions.
    P  P  P  —  —  —  —




                                                        List of Survey Questions                                                                                                                      Page 123

Routine Data Quality Assessment Tool, June 2008

  • 1.
    Routine Data QualityAssessment (RDQA) Checklist to Assess Program/Project Data Quality Number of Regional Aggregation Sites þÿ1 Number of District Aggregation Sites þÿ1 Number of Service Delivery Sites þÿ1 Version: Jan 2010 Important notes for the use of this spreadsheet: 1. In order to use the Routine Data Quality Assessment tool you will need to ensure that your 'macro security' is set to something less than 'high'. With the spreadsheet open, go to the 'Tools' pull-down menu and select 'Macro', then 'Security'. Select 'medium'. Close Excel and re-open the file. When you open the file the next time you will have to select 'Enable Macros' for the application to work as designed. 2. On the START Page (this page), please select number of intermediate aggregation sites (IAS) and Service Delivery Points (SDPs) that you plan to review from the dropdown lists above. IAS are typically the district level health unit of the Ministry of Health. START Page 1
  • 2.
    B – INSTRUCTIONSFOR USE OF THE RDQA 1. Determine Purpose The RDQA checklist can be used for: T Initial assessment of M&E systems established by new implementing partners (or in decentralized systems) to collect, manage and report data. I Routine supervision of data management and reporting systems and data quality at various levels. For example, routine supervision visits may include checking on a certain time period worth of data (e.g. one day, one week or one month) at the service site level, whereas periodic assessments (e.g. quarterly, biannually or annually) could be carried out at all levels to assess the functioning of the entire Program/project’s M&E system. t Periodic assessment by donors of the quality of data being provided to them (this use of the DQA could be more frequent and more streamlined than official data quality audits that use the DQA for Auditing) but less frequent than routine monitoring of data. D Preparation for a formal data quality audit. The RDQA is flexible for all of these uses. Countries and programs are encouraged to adapt the checklist to fit local program contexts. 2. Level/Site Selection Select levels and sites to be included (depending on the purpose and resources available). Once the purpose has been determined, the second step in the RDQA is to decide what levels of the data-collection and reporting system will be included in the assessment - service sites, intermediate aggregation levels, and/or central M&E unit. The levels should be determined once the appropriate reporting levels have been identified and “mapped” (e.g., there are 100 sites providing the services in 10 districts. Reports from sites are sent to districts, which then send aggregated reports to the M&E Unit). In some cases, the data flow will include more than one intermediate level (e.g. regions, provinces or states or multiple levels of program organizations). 3. Identify indicators, data sources and reporting period. 
The RDQA is designed to assess the quality of data and underlying systems related to indicators that are reported to programs or donors. It is necessary to select one or more indicators – or at least program areas – to serve as the subject of the RDQA. This choice will be based on the list of reported indicators. For example, a program focusing on treatment for HIV may report indicators of numbers of people on ART. Another program may focus on meeting the needs of orphans or vulnerable children, therefore the indicators for that program would be from the OVC program area. A malaria program might focus on providing insecticide-treated bed nets (ITN) or on treating people for malaria – or on both of those activities. 4. Conduct site visits. During the site visits, the relevant sections of the appropriate checklists in the Excel file are filled out (e.g. the service site checklist at service sites, etc). These checklists are completed following interviews of relevant staff and reviews of site documentation. Using the drop down lists on the HEADER page of this workbook, select the appropriate number of Intermediate Aggregation Levels (IAL) and Service Delivery Points (SDP) to be reviewed. The appropriate number of worksheets will automatically appear in the RDQA workbook (up to 12 SDP and 4 IALs). 5. Review outputs and findings. The RDQAoutputs need to be reviewed for each site visited. Site-specific summary findings in the form of recommendations are noted at each site visited. The RDQA checklists exist in MS Excel format and responses can be entered directly into the spreadsheets on the computer. Alternatively, the checklists can be printed and completed by hand. When completed electronically, a dashboard produces graphics of summary statistics for each site and level of the reporting system. 
The dashboard displays two (2) graphs for each site visited: - A spider-graph displays qualitative data generated from the assessment of the data-collection and reporting system and can be used to prioritize areas for improvement. - A bar-chart shows the quantitative data generated from the data verifications; these can be used to plan for data quality improvement. In addition, a 'Global Dashboard' shows statistics aggregated across and within levels to highlight overall strengths and weaknesses in the reporting system. The Global Dashboard shows a spider graph for qualitative assessments and a bar chart for quantitative assessments as above. In addition, stengths and weakness of the reporting system are displayed as dimensions of data quality in a 100% stacked bar chart. For this analysis questions are grouped by the applicable dimension of data quality (e.g. accuracy or reliability) and the number of responses by type of response (e.g. 'Yes - completely', 'Partly' etc.) are plotted as a percentage of all responses. A table of survey questions and their associated dimensions of data quality can be found on the 'Dimensions of data quality' tab in this workbook. 6. Develop a system’s strengthening plan, including follow-up actions. The final output of the RDQA is an action plan for improving data quality which describes the identified strengthening measures, the staff responsible, the timeline for completion, resources required and follow-up. Using the graphics and the detailed comments for each question, weak performing functional areas of the reporting system can be identified. Program staff can then outline strengthening measures (e.g. training, data reviews), assign responsibilities and timelines and identify resources using the Action Plan tab in this workbook. INSTRUCTIONS Page 2
  • 3.
    C – BACKGROUNDINFORMATION – RDQA Country: Name of Program/project: Indicator Reviewed: Reporting Period Verified: Assessment Team: Name Title Email Primary contact: M&E Management Unit at Central Level Name of Site Facility Code Date (mm/dd/yy) 1- Regional Level Aggregation Sites Name of Site Facility Code Region Region Code Date (mm/dd/yy) 1 District Level Aggregation Sites Name of Site Facility Code District District Code Region Region Code Date (mm/dd/yy) 1 Service Delivery Points (SDPs) Name of Site Facility Code District District Code Region Region Code Date (mm/dd/yy) 1 Information_Page Page 3
  • 4.
    Data Verification andSystem Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. 
Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 1 Page 4
  • 5.
    Part 2. SystemsAssessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. 
….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 1 Page 5
  • 6.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and 1.00 National Reporting Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 1 Page 6
  • 7.
    Data Verification andSystem Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. 
Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 2 Page 7
  • 8.
    Part 2. SystemsAssessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. 
….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 2 Page 8
  • 9.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 2 Page 9
  • 10.
    Data Verification andSystem Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. 
Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 3 Page 10
  • 11.
    Part 2. SystemsAssessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. 
….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 3 Page 11
  • 12.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 3 Page 12
  • 13.
Data Verification and System Assessment Sheet - Service Delivery Point

Service Delivery Point/Organization: -
Region and District: -
Indicator Reviewed: -
Date of Review: -
Reporting Period Verified: -

Answer Codes: Yes - completely / Partly / No - not at all / N/A

REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.

Part 1: Data Verifications

A - Documentation Review: Review the availability and completeness of all indicator source documents for the selected reporting period.

1. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? If yes, determine how this might have affected reported numbers.
2. Are all available source documents complete? If no, determine how this might have affected reported numbers.
3. Review the dates on the source documents. Do all dates fall within the reporting period? If no, determine how this might have affected reported numbers.

B - Recounting Reported Results: Recount results from source documents, compare the verified numbers to the site-reported numbers, and explain any discrepancies.

4. Recount the number of people, cases or events during the reporting period by reviewing the source documents. [A]
5. Enter the number of people, cases or events reported by the site during the reporting period from the site summary report. [B]
6. Calculate the ratio of recounted to reported numbers. [A/B]
7. What are the reasons for any discrepancy observed (e.g., data entry errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test kits or ITNs purchased and delivered during the reporting period, to see whether these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying whether these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register, and from the Register to Patient Treatment Cards).

8. List the documents used for performing the cross-checks.
9. Describe the cross-checks performed.
10. What are the reasons for any discrepancy observed?

Service Point 4, Page 13
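The recount ratio in item 6 and the two-way cross-check described in section C can be sketched as follows. This is an illustrative sketch only: the function names are invented for this example, and the over/under-reporting reading of the ratio follows the usual RDQA convention (the workbook itself computes the ratio in the spreadsheet).

```python
def verification_factor(recounted, reported):
    """Ratio of recounted to site-reported results, i.e. [A/B] from items 4-6.

    1.0 means perfect agreement; a ratio below 1.0 suggests the site
    over-reported, above 1.0 that it under-reported.
    """
    if reported == 0:
        return None  # ratio is undefined when the site reported zero
    return recounted / reported


def cross_check(patient_cards, register):
    """Bidirectional cross-check (section C): IDs on patient cards that are
    missing from the register, and register entries with no matching card."""
    cards, reg = set(patient_cards), set(register)
    return cards - reg, reg - cards
```

For example, a recount of 95 against a site report of 100 gives a verification factor of 0.95, and any IDs returned by `cross_check` in either direction are candidates for the discrepancy reasons asked for in item 10.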
Part 2: Systems Assessment

I - M&E Structure, Functions and Capabilities
1. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit).
2. The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.
3. All relevant staff have received training on the data management processes and tools.

II - Indicator Definitions and Reporting Guidelines
The M&E Unit has provided written guidelines to each sub-reporting level on ...
4. ... what they are supposed to report on.
5. ... how (e.g., in what specific format) reports are to be submitted.
6. ... to whom the reports should be submitted.
7. ... when the reports are due.

III - Data-collection and Reporting Forms and Tools
8. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
9. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels.
10. ... The standard forms/tools are consistently used by the Service Delivery Site.
11. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).
12. The data collected on the source document have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc., if the indicator specifies disaggregation by these characteristics).

IV - Data Management Processes
13. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
14. If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
15. ... If yes, the latest back-up date is appropriate given the frequency of updates to the computerized system (e.g., back-ups are weekly or monthly).
16. Relevant personal data are maintained according to national or international confidentiality guidelines.
17. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, or a person registered as receiving the same service in two different locations).
18. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.

V - Links with National Reporting System
19. When available, the relevant national forms/tools are used for data collection and reporting.
20. When applicable, data are reported through a single channel of the national information system.
21. The system records information about where the service is delivered (i.e., region, district, ward, etc.).
22. ... If yes, place names are recorded using standardized naming conventions.

Service Point 4, Page 14
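The dashboard in Part 4 plots an average per systems-assessment component on a 0.00-3.00 scale. A minimal sketch of that aggregation, assuming a Yes = 3 / Partly = 2 / No = 1 weighting with N/A responses excluded (the weights and exclusion rule are assumptions; the workbook's own macros may compute the averages differently):

```python
# Assumed score weights for the answer codes; N/A is left out of the map
# so it is excluded from the average.
SCORES = {"Yes - completely": 3, "Partly": 2, "No - not at all": 1}

def component_average(responses):
    """Average score for one component (e.g. 'I - M&E Structure, Functions
    and Capabilities') on the dashboard's 0.00-3.00 scale."""
    scored = [SCORES[r] for r in responses if r in SCORES]
    return round(sum(scored) / len(scored), 2) if scored else None
```

Under these assumed weights, a component answered "Yes - completely", "Partly", "N/A" would average 2.5, since only the first two responses are scored.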
  • 15.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 4 Page 15
  • 16.
    Data Verification andSystem Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. 
Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 5 Page 16
  • 17.
    Part 2. SystemsAssessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. 
….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 5 Page 17
  • 18.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 5 Page 18
  • 19.
    Data Verification andSystem Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. 
Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 6 Page 19
  • 20.
    Part 2. SystemsAssessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. 
….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 6 Page 20
  • 21.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 6 Page 21
  • 22.
    Data Verification andSystem Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. 
Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 7 Page 22
  • 23.
    Part 2. SystemsAssessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. 
….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 7 Page 23
  • 24.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 7 Page 24
  • 25.
    Data Verification andSystem Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. 
Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 8 Page 25
  • 26.
    Part 2. SystemsAssessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. 
….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 8 Page 26
  • 27.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 8 Page 27
  • 28.
Data Verification and System Assessment Sheet - Service Delivery Point

Service Delivery Point/Organization: -    Region and District: -    Indicator Reviewed: -    Date of Review: -    Reporting Period Verified: -

Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.

Part 1: Data Verifications

A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period.
1   Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? If yes, determine how this might have affected reported numbers.
2   Are all available source documents complete? If no, determine how this might have affected reported numbers.
3   Review the dates on the source documents. Do all dates fall within the reporting period? If no, determine how this might have affected reported numbers.

B - Recounting Reported Results: Recount results from source documents, compare the verified numbers to the site-reported numbers and explain any discrepancies.
4   Recount the number of people, cases or events during the reporting period by reviewing the source documents. [A]
5   Enter the number of people, cases or events reported by the site during the reporting period from the site summary report. [B]
6   Calculate the ratio of recounted to reported numbers. [A/B]
7   What are the reasons for any discrepancy observed (e.g., data entry errors, arithmetic errors, missing source documents, other)?

C - Cross-check Reported Results with Other Data Sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying whether these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from the Register to Patient Treatment Cards).
8   List the documents used for performing the cross-checks.
9   Describe the cross-checks performed.
10  What are the reasons for any discrepancy observed?

Service Point 9                                                              Page 28
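The recount in rows 4-6 and the bidirectional cross-check in section C reduce to simple arithmetic and set comparisons. A minimal sketch of both (the function names and sample patient IDs are illustrative, not part of the RDQA tool itself):

```python
def verification_factor(recounted: int, reported: int) -> float:
    """Row 6: ratio of recounted [A] to reported [B] results."""
    if reported == 0:
        raise ValueError("reported count [B] must be non-zero")
    return recounted / reported


def cross_check(cards: set[str], register: set[str]) -> tuple[set[str], set[str]]:
    """Section C, performed in both directions: card IDs missing from the
    register, and register entries with no matching patient card."""
    return cards - register, register - cards


# Example: 95 clients recounted from source documents vs. 100 reported upward.
# A factor below 100% suggests over-reporting; above 100%, under-reporting.
print(f"Verification factor: {verification_factor(95, 100):.0%}")  # 95%

missing_in_register, missing_cards = cross_check(
    {"P01", "P02", "P03"}, {"P02", "P03", "P04"}
)
```

Discrepancies surfaced by either check feed directly into row 7 and row 10 ("reasons for any discrepancy observed").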
Part 2: Systems Assessment

I - M&E Structure, Functions and Capabilities
1   There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit).
2   The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.
3   All relevant staff have received training on the data management processes and tools.

II - Indicator Definitions and Reporting Guidelines
The M&E Unit has provided written guidelines to each sub-reporting level on …
4   … what they are supposed to report on.
5   … how (e.g., in what specific format) reports are to be submitted.
6   … to whom the reports should be submitted.
7   … when the reports are due.

III - Data-collection and Reporting Forms and Tools
8   Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
9   The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels.
10  … The standard forms/tools are consistently used by the Service Delivery Site.
11  All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).
12  The data collected on the source document have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).

IV - Data Management Processes
13  If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
14  If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
15  … If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
16  Relevant personal data are maintained according to national or international confidentiality guidelines.
17  The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, or a person registered as receiving the same service in two different locations).
18  The reporting system enables the identification and recording of a "drop-out", a person "lost to follow-up" and a person who died.

V - Links with National Reporting System
19  When available, the relevant national forms/tools are used for data collection and reporting.
20  When applicable, data are reported through a single channel of the national information system.
21  The system records information about where the service is delivered (i.e., region, district, ward, etc.).
22  … If yes, place names are recorded using standardized naming conventions.

Service Point 9                                                              Page 29
  • 30.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 9 Page 30
  • 31.
    Data Verification andSystem Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. 
Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 10 Page 31
  • 32.
    Part 2. SystemsAssessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. 
….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 10 Page 32
  • 33.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 10 Page 33
  • 34.
    Data Verification andSystem Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. 
Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 11 Page 34
  • 35.
    Part 2. SystemsAssessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. 
….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 11 Page 35
  • 36.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 11 Page 36
  • 37.
    Data Verification andSystem Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. 
Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 12 Page 37
  • 38.
    Part 2. SystemsAssessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. 
….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 12 Page 38
  • 39.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 12 Page 39
  • 40.
    Data Verification andSystem Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. 
Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 13 Page 40
  • 41.
    Part 2. SystemsAssessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. 
….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 13 Page 41
  • 42.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 13 Page 42
  • 43.
Data Verification and System Assessment Sheet - Service Delivery Point

Service Delivery Point/Organization: -        Region and District: -
Indicator Reviewed: -        Date of Review: -        Reporting Period Verified: -

Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.

Part 1: Data Verifications

A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period.
1. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? If yes, determine how this might have affected reported numbers.
2. Are all available source documents complete? If no, determine how this might have affected reported numbers.
3. Review the dates on the source documents. Do all dates fall within the reporting period? If no, determine how this might have affected reported numbers.

B - Recounting Reported Results: Recount results from source documents, compare the verified numbers to the site-reported numbers, and explain any discrepancies.
4. Recount the number of people, cases or events during the reporting period by reviewing the source documents. [A]
5. Enter the number of people, cases or events reported by the site during the reporting period from the site summary report. [B]
6. Calculate the ratio of recounted to reported numbers. [A/B]
7. What are the reasons for any discrepancy observed (e.g., data entry errors, arithmetic errors, missing source documents, other)?

C - Cross-check Reported Results with Other Data Sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test kits or ITNs purchased and delivered during the reporting period, to see whether these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying that these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from the Register back to Patient Treatment Cards).
8. List the documents used for performing the cross-checks.
9. Describe the cross-checks performed.
10. What are the reasons for any discrepancy observed?

Service Point 14                                                                Page 43
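Step 6 above is a simple ratio, but its interpretation is easy to invert. A minimal sketch (hypothetical figures; the function name is illustrative, not part of the tool) of how the verification factor is computed and read:

```python
def verification_factor(recounted: int, reported: int) -> float:
    """Verification factor = recounted [A] / reported [B].

    A factor of 1.0 (100%) means the site summary report matches the
    source documents; below 1.0 suggests over-reporting by the site,
    above 1.0 suggests under-reporting.
    """
    if reported == 0:
        raise ValueError("reported count [B] must be non-zero")
    return recounted / reported

# Hypothetical example: 90 cases recounted from the registers,
# 100 cases on the site summary report.
factor = verification_factor(90, 100)
print(f"{factor:.0%}")  # prints "90%" - i.e., over-reporting
```

Any factor other than 100% should be explained under item 7 (data entry errors, arithmetic errors, missing source documents, other).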
Part 2: Systems Assessment

I - M&E Structure, Functions and Capabilities
1. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit).
2. The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.
3. All relevant staff have received training on the data management processes and tools.

II - Indicator Definitions and Reporting Guidelines
The M&E Unit has provided written guidelines to each sub-reporting level on ...
4. ... what they are supposed to report on.
5. ... how (e.g., in what specific format) reports are to be submitted.
6. ... to whom the reports should be submitted.
7. ... when the reports are due.

III - Data-collection and Reporting Forms and Tools
8. Clear instructions have been provided by the M&E Unit on how to complete the data-collection and reporting forms/tools.
9. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels.
10. ... The standard forms/tools are consistently used by the Service Delivery Site.
11. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).
12. The data collected on the source document have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).

IV - Data Management Processes
13. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
14. If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
15. ... If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
16. Relevant personal data are maintained according to national or international confidentiality guidelines.
17. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).
18. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.

V - Links with National Reporting System
19. When available, the relevant national forms/tools are used for data collection and reporting.
20. When applicable, data are reported through a single channel of the national information systems.
21. The system records information about where the service is delivered (i.e., region, district, ward, etc.).
22. ... If yes, place names are recorded using standardized naming conventions.

Service Point 14                                                                Page 44
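The dashboard plots one score per systems-assessment area on a 0.00-3.00 scale. The workbook's own formulas are not reproduced in this extract, so the following is only a plausible reading consistent with the answer codes: map "Yes - completely" to 3, "Partly" to 2, "No - not at all" to 1, exclude "N/A", and average within each area. A hypothetical sketch:

```python
from statistics import mean
from typing import Optional

# Assumed answer-code mapping, consistent with the dashboard's
# 0.00-3.00 axis; the actual spreadsheet formulas may differ.
SCORES = {"Yes - completely": 3, "Partly": 2, "No - not at all": 1}

def area_score(answers: list[str]) -> Optional[float]:
    """Average score for one assessment area; "N/A" answers are excluded."""
    scored = [SCORES[a] for a in answers if a in SCORES]
    return round(mean(scored), 2) if scored else None

# Hypothetical responses for area I (questions 1-3):
print(area_score(["Yes - completely", "Partly", "N/A"]))  # prints 2.5
```

Lower-scoring areas point to where the strengthening measures in Part 3 should concentrate.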
  • 45.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 14 Page 45
  • 46.
    Data Verification andSystem Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. 
Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 15 Page 46
  • 47.
    Part 2. SystemsAssessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. 
….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 15 Page 47
  • 48.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 15 Page 48
  • 49.
    Data Verification andSystem Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. 
Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 16 Page 49
  • 50.
    Part 2. SystemsAssessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. 
….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 16 Page 50
  • 51.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 16 Page 51
  • 52.
    Data Verification andSystem Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. 
Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 17 Page 52
  • 53.
    Part 2. SystemsAssessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. 
….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 17 Page 53
  • 54.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 17 Page 54
  • 55.
    Data Verification andSystem Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. 
Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 18 Page 55
  • 56.
    Part 2. SystemsAssessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. 
….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 18 Page 56
  • 57.
    Part 3: Recommendationsfor the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 18 Page 57
  • 58.
Data Verification and System Assessment Sheet - Service Delivery Point

Service Delivery Point/Organization: -        Region and District: -
Indicator Reviewed: -        Date of Review: -        Reporting Period Verified: -

Columns: Component of the M&E System | Answer Code (Yes - completely / Partly / No - not at all / N/A) | Reviewer Comments (please provide detail for each response not coded "Yes - completely"; detailed responses will help guide strengthening measures).

Part 1: Data Verifications

A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period.

1. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? If yes, determine how this might have affected reported numbers.
2. Are all available source documents complete? If no, determine how this might have affected reported numbers.
3. Review the dates on the source documents. Do all dates fall within the reporting period? If no, determine how this might have affected reported numbers.

B - Recounting Reported Results: Recount results from source documents, compare the verified numbers to the site-reported numbers, and explain any discrepancies.

4. Recount the number of people, cases or events during the reporting period by reviewing the source documents. [A]
5. Enter the number of people, cases or events reported by the site during the reporting period from the site summary report. [B]
6. Calculate the ratio of recounted to reported numbers. [A/B]
7. What are the reasons for any discrepancy observed (e.g., data entry errors, arithmetic errors, missing source documents, other)?

C - Cross-check Reported Results with Other Data Sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test kits or ITNs purchased and delivered during the reporting period, to see whether these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying whether these patients were recorded in the unit, laboratory or pharmacy registers. Where relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from the Register to Patient Treatment Cards).

8. List the documents used for performing the cross-checks.
9. Describe the cross-checks performed.
10. What are the reasons for any discrepancy observed?

Service Point 19                                                              Page 58
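The recount comparison in items 4-6 reduces to a simple ratio: recounted results [A] divided by site-reported results [B]. The workbook computes this in Excel; the sketch below (function and variable names are illustrative, not from the tool) shows the same arithmetic and how to read the result.

```python
def verification_factor(recounted: int, reported: int) -> float:
    """Ratio of recounted results [A] to site-reported results [B].

    1.0 (100%) means the recount matches the report exactly;
    above 1.0 suggests under-reporting, below 1.0 over-reporting.
    """
    if reported == 0:
        raise ValueError("reported count [B] must be non-zero")
    return recounted / reported

# Example: 95 events found in source documents vs. 100 reported by the site.
factor = verification_factor(95, 100)
print(f"Verification factor: {factor:.0%}")  # prints "Verification factor: 95%"
```

A factor well off 100% does not by itself say which record is wrong; item 7 asks the reviewer to trace the cause (data entry errors, arithmetic errors, missing source documents, other).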
Part 2: Systems Assessment

I - M&E Structure, Functions and Capabilities

1. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit).
2. The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.
3. All relevant staff have received training on the data management processes and tools.

II - Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on ...
4. ... what they are supposed to report on.
5. ... how (e.g., in what specific format) reports are to be submitted.
6. ... to whom the reports should be submitted.
7. ... when the reports are due.

III - Data-collection and Reporting Forms and Tools

8. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
9. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels.
10. ... The standard forms/tools are consistently used by the Service Delivery Site.
11. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).
12. The data collected on the source document have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).

IV - Data Management Processes

13. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
14. If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
15. ... If yes, the latest back-up date is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
16. Relevant personal data are maintained according to national or international confidentiality guidelines.
17. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, or a person registered as receiving the same service in two different locations).
18. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.

V - Links with National Reporting System

19. When available, the relevant national forms/tools are used for data collection and reporting.
20. When applicable, data are reported through a single channel of the national information system.
21. The system records information about where the service is delivered (i.e., region, district, ward, etc.).
22. ... If yes, place names are recorded using standardized naming conventions.

Service Point 19                                                              Page 59
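The Part 4 dashboard summarizes these systems-assessment answers per component on a 0.00-3.00 scale. A minimal sketch of that aggregation, assuming the coding commonly used with this checklist (Yes - completely = 3, Partly = 2, No - not at all = 1, N/A excluded from the average) - the mapping is an assumption here, so check it against the workbook's own formulas:

```python
from statistics import mean

# Assumed answer coding; N/A responses are excluded from the average.
SCORES = {"Yes - completely": 3, "Partly": 2, "No - not at all": 1}

def component_score(answers: list[str]) -> float:
    """Average score for one system component on the dashboard's 0-3 scale."""
    scored = [SCORES[a] for a in answers if a != "N/A"]
    return mean(scored) if scored else 0.0

# Example: component I, three questions answered.
print(component_score(["Yes - completely", "Partly", "N/A"]))  # prints 2.5
```

One average per component yields the five spokes of the radar chart, making it easy to see at a glance which part of the site's M&E system needs strengthening.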
Data Verification and System Assessment Sheet - Service Delivery Point

Service Delivery Point/Organization: -
Region and District: -
Indicator Reviewed: -
Date of Review: -
Reporting Period Verified: -

Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.

Part 1: Data Verifications

A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period.
1. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? If yes, determine how this might have affected reported numbers.
2. Are all available source documents complete? If no, determine how this might have affected reported numbers.
3. Review the dates on the source documents. Do all dates fall within the reporting period? If no, determine how this might have affected reported numbers.

B - Recounting Reported Results: Recount results from source documents, compare the verified numbers to the site-reported numbers and explain any discrepancies.
4. Recount the number of people, cases or events during the reporting period by reviewing the source documents. [A]
5. Enter the number of people, cases or events reported by the site during the reporting period from the site summary report. [B]
6. Calculate the ratio of recounted to reported numbers. [A/B]
7. What are the reasons for any discrepancy observed (e.g., data entry errors, arithmetic errors, missing source documents, other)?

C - Cross-check Reported Results with Other Data Sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period, to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying that these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from the Register to Patient Treatment Cards).
8. List the documents used for performing the cross-checks.
9. Describe the cross-checks performed.
10. What are the reasons for any discrepancy observed?

Service Point 24    Page 73
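Items 4 through 6 of Part 1, Section B reduce to a single ratio. As a minimal sketch (the function name and the example figures are illustrative, not part of the RDQA tool), the recounted-to-reported verification factor can be computed as:

```python
def verification_factor(recounted: int, reported: int) -> float:
    """Item 6: ratio of recounted [A] to site-reported [B] results.

    1.0 means the recount matches the site report exactly;
    above 1.0 suggests under-reporting, below 1.0 over-reporting.
    """
    if reported == 0:
        raise ValueError("Reported count [B] must be non-zero")
    return recounted / reported

# Illustrative figures: 95 events recounted from source documents,
# 100 reported on the site summary report.
print(f"{verification_factor(95, 100):.0%}")  # prints "95%"
```

Expressed as a percentage, this is the Verification Factor plotted on the Part 4 dashboard.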
Part 2: Systems Assessment

I - M&E Structure, Functions and Capabilities
1. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit).
2. The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.
3. All relevant staff have received training on the data management processes and tools.

II - Indicator Definitions and Reporting Guidelines
The M&E Unit has provided written guidelines to each sub-reporting level on …
4. … what they are supposed to report on.
5. … how (e.g., in what specific format) reports are to be submitted.
6. … to whom the reports should be submitted.
7. … when the reports are due.

III - Data-collection and Reporting Forms and Tools
8. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
9. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels.
10. … The standard forms/tools are consistently used by the Service Delivery Site.
11. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).
12. The data collected on the source documents have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).

IV - Data Management Processes
13. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
14. If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
15. … If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
16. Relevant personal data are maintained according to national or international confidentiality guidelines.
17. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, or a person registered as receiving the same service in two different locations).
18. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.

V - Links with National Reporting System
19. When available, the relevant national forms/tools are used for data-collection and reporting.
20. When applicable, data are reported through a single channel of the national information systems.
21. The system records information about where the service is delivered (i.e., region, district, ward, etc.).
22. … If yes, place names are recorded using standardized naming conventions.

Service Point 24    Page 74
Part 3: Recommendations for the Service Site

Based on the findings of the systems review and data verification at the service site, please describe any challenges to data quality identified and the recommended strengthening measures, with an estimate of the length of time each improvement measure could take. These will be discussed with the Program.

Identified Weaknesses    Description of Action Point    Responsible(s)    Time Line
1.
2.
3.
4.

Part 4: DASHBOARD: Service Delivery Point
[Radar chart: Data Management Assessment - Service Delivery Point; each of the five functional areas (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System) scored on a 0.00-3.00 scale]
[Bar chart: Data and Reporting Verifications - Service Delivery Point; Verification Factor on a 0%-1200% scale]

Service Point 24    Page 75
Service Delivery Site Summary Statistics

[Radar chart: Data Management Assessment - Service Site Summary; each functional area (M&E Structure, Functions and Capabilities; Indicator Definitions and Reporting Guidelines; Data-collection and Reporting Forms / Tools; Data Management Processes; Links with National Reporting System) scored on a 0-3 scale]
[Histogram: Data and Reporting Verifications - Service Site Summary; Number of Sites by Percent Accuracy, in bins <=70, 71-80, 81-90, 91-100, 101-110, 111-120, 121-130, >130]

Service Site Summary    Page 76
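The summary histogram groups sites by percent accuracy (the verification factor expressed as a percentage). A sketch of that binning, assuming the integer bin edges shown on the chart (function names are illustrative):

```python
from collections import Counter

def accuracy_bin(percent: float) -> str:
    """Map a site's percent accuracy to its histogram bin label."""
    if percent <= 70:
        return "<=70"
    if percent > 130:
        return ">130"
    edges = [70, 80, 90, 100, 110, 120, 130]
    for lo, hi in zip(edges, edges[1:]):
        if percent <= hi:
            return f"{lo + 1}-{hi}"

def summarize(site_accuracies: list[float]) -> Counter:
    """Number of sites per accuracy bin, as charted in the summary."""
    return Counter(accuracy_bin(p) for p in site_accuracies)

# Illustrative: four sites with percent accuracies of 95, 102, 68 and 95
print(summarize([95.0, 102.0, 68.0, 95.0]))
```

Sites well above or below the 91-100 bin flag over- or under-reporting worth investigating during supervision visits.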
Data Verification and System Assessment Sheet - District Site

District Site/Organization: -
Region and District: -
Indicator Reviewed: -
Date of Review: -
Reporting Period Verified: -

Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.

Part 1: Data Verifications

A - Recounting Reported Results: Recount results from the periodic reports sent from service sites to the District and compare to the value reported by the District. Explain any discrepancies.
1. Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A]
2. What aggregated result was contained in the summary report prepared by the District (and submitted to the next reporting level)? [B]
3. Calculate the ratio of recounted to reported numbers. [A/B]
4. What are the reasons for any discrepancy observed (e.g., data entry errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance: Review availability, completeness and timeliness of reports from all Service Delivery Sites. How many reports should there have been from all sites? How many are there? Were they received on time? Are they complete?
5. How many reports should there have been from all service sites? [A]
6. How many reports are there? [B]
7. Calculate % Available Reports. [B/A]
8. Check the dates on the reports received. How many reports were received on time (i.e., by the due date)? [C]
9. Calculate % On-time Reports. [C/A]
10. How many reports were complete (i.e., the report contained all the required indicator data*)? [D]
11. Calculate % Complete Reports. [D/A]
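The three reporting-performance percentages in Section B (items 7, 9 and 11) can be sketched as follows; the function and key names are illustrative, not part of the RDQA tool:

```python
def reporting_performance(expected: int, received: int,
                          on_time: int, complete: int) -> dict:
    """Compute % Available [B/A], % On-time [C/A] and % Complete [D/A]
    for the reports expected from all service sites in the period."""
    if expected == 0:
        raise ValueError("Expected report count [A] must be non-zero")
    return {
        "% Available": 100 * received / expected,
        "% On time": 100 * on_time / expected,
        "% Complete": 100 * complete / expected,
    }

# Illustrative figures: 12 service sites expected to report;
# 11 reports received, 9 on time, 10 complete.
print(reporting_performance(12, 11, 9, 10))
```

These three percentages appear alongside the verification factor on the Part 4 district dashboard.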
Part 2: Systems Assessment

I - M&E Structure, Functions and Capabilities
1. There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
2. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
3. All relevant staff have received training on the data management processes and tools.

District Site 1    Page 77
II - Indicator Definitions and Reporting Guidelines
The M&E Unit has provided written guidelines to each sub-reporting level on …
4. … what they are supposed to report on.
5. … how (e.g., in what specific format) reports are to be submitted.
6. … to whom the reports should be submitted.
7. … when the reports are due.

III - Data-collection and Reporting Forms / Tools
8. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
9. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels.
10. … The standard forms/tools are consistently used by the Service Delivery Site.
11. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).

IV - Data Management Processes
12. Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
13. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
14. If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
15. … If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
16. Relevant personal data are maintained according to national or international confidentiality guidelines.
17. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, or a person registered as receiving the same service in two different locations).
18. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
19. There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with service points on data quality issues.
20. If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved.

V - Links with National Reporting System
21. When applicable, the data are reported through a single channel of the national reporting system.
22. When available, the relevant national forms/tools are used for data-collection and reporting.
23. The system records information about where the service is delivered (i.e., region, district, ward, etc.).
24. … If yes, place names are recorded using standardized naming conventions.

District Site 1    Page 78
Part 3: Recommendations for the District Site

Based on the findings of the systems review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measures, with an estimate of the length of time each improvement measure could take. See the systems assessment by functional area (table below) for a review of the system. Action points should be discussed with the Program.

Identified Weaknesses    Description of Action Point    Responsible(s)    Time Line
1.
2.
3.
4.

Part 4: DASHBOARD: District Site
[Radar chart: Data Management Assessment - District Site; each of the five functional areas (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System) scored on a 0-3 scale]
[Bar chart: Data and Reporting Verifications - District Site; Verification Factor, % Available, % On Time and % Complete on a 0%-1200% scale]

District Site 1    Page 79
    Data Verification andSystem Assessment Sheet - District Site District Site/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - REVIEWER COMMENTS completely Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Deta responses will help guide strengthening measures. ) No - not at all N/A Part 1: Data Verifications A - Recounting reported Results: Recount results from the periodic reports sent from service sites to the District and compare to the value reported by the District. Explain discrepancies (if any). Re-aggregate the numbers from the reports received from all Service Delivery 1 Points. What is the re-aggregated number? [A] What aggregated result was contained in the summary report prepared by the 2 District (and submitted to the next reporting level)? [B] 3 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 4 errors, arithmetic errors, missing source documents, other)? B - Reporting Performance: Review availability, completeness, and timeliness of reports from all Service Delivery Sites. How many reports should there have been from all Sites? How many are there? Were they received on time? Are they complete? 5 How many reports should there have been from all service sites? [A] 6 How many reports are there? [B] 7 Calculate % Available Reports [B/A] - Check the dates on the reports received. How many reports were received 8 on time? (i.e., received by the due date). [C] 9 Calculate % On time Reports [C/A] - How many reports were complete? (i.e., complete means that the report 10 contained all the required indicator data*). [D] 11 Calculate % Complete Reports [D/A] - Part 2. 
Systems Assessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing the quality of data (i.e., 1 accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points). There are designated staff responsible for reviewing aggregated numbers 2 prior to submission to the next level (e.g., to the central M&E Unit). All relevant staff have received training on the data management processes 3 and tools. District Site 2 Page 80
  • 81.
    II- Indicator Definitionsand Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III- Data-collection and Reporting Forms / Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels ….The standard forms/tools are consistently used by the Service Delivery 10 Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). IV- Data Management Processes Feedback is systematically provided to all service points on the quality of their 12 reporting (i.e., accuracy, completeness and timeliness). If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. If yes, the latest date of back-up is appropriate given the frequency of update 15 of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. 
There is a written procedure to address late, incomplete, inaccurate and 19 missing reports; including following-up with service points on data quality issues. If data discrepancies have been uncovered in reports from service points, the 20 Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved. V - Links with National Reporting System When applicable, the data are reported through a single channel of the 17 national reporting system. When available, the relevant national forms/tools are used for data-collection 21 and reporting. The system records information about where the service is delivered (i.e. 22 region, district, ward, etc.) 23 ….if yes, place names are recorded using standarized naming conventions. District Site 2 Page 81
  • 82.
    Part 3: Recommendationsfor the District Site Based on the findings of the systems’ review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measur an estimate of the length of time the improvement measure could take. See systems assessment functions by function area (table below) for review of system). Action points should be disc with the Program. Identified Weaknesses Description of Action Point Responsible(s) 1 2 3 4 Part 4: DASHBOARD: District Site Data Management Assessment - Data and Reporting Verifications - District Site District Site 1200% I - M&E Structure, Functions and Capabilities 1000% 3 2 800% II- Indicator V - Links with Definitions and National Reporting 1 Reporting System Guidelines 600% 0 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification factor % Available % On Time % Complete District Site 2 Page 82
  • 83.
    Data Verification andSystem Assessment Sheet - District Site District Site/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - REVIEWER COMMENTS completely Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Deta responses will help guide strengthening measures. ) No - not at all N/A Part 1: Data Verifications A - Recounting reported Results: Recount results from the periodic reports sent from service sites to the District and compare to the value reported by the District. Explain discrepancies (if any). Re-aggregate the numbers from the reports received from all Service Delivery 1 Points. What is the re-aggregated number? [A] What aggregated result was contained in the summary report prepared by the 2 District (and submitted to the next reporting level)? [B] 3 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 4 errors, arithmetic errors, missing source documents, other)? B - Reporting Performance: Review availability, completeness, and timeliness of reports from all Service Delivery Sites. How many reports should there have been from all Sites? How many are there? Were they received on time? Are they complete? 5 How many reports should there have been from all service sites? [A] 6 How many reports are there? [B] 7 Calculate % Available Reports [B/A] - Check the dates on the reports received. How many reports were received 8 on time? (i.e., received by the due date). [C] 9 Calculate % On time Reports [C/A] - How many reports were complete? (i.e., complete means that the report 10 contained all the required indicator data*). [D] 11 Calculate % Complete Reports [D/A] - Part 2. 
Systems Assessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing the quality of data (i.e., 1 accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points). There are designated staff responsible for reviewing aggregated numbers 2 prior to submission to the next level (e.g., to the central M&E Unit). All relevant staff have received training on the data management processes 3 and tools. District Site 3 Page 83
  • 84.
    II- Indicator Definitionsand Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III- Data-collection and Reporting Forms / Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels ….The standard forms/tools are consistently used by the Service Delivery 10 Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). IV- Data Management Processes Feedback is systematically provided to all service points on the quality of their 12 reporting (i.e., accuracy, completeness and timeliness). If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. If yes, the latest date of back-up is appropriate given the frequency of update 15 of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. 
There is a written procedure to address late, incomplete, inaccurate and 19 missing reports; including following-up with service points on data quality issues. If data discrepancies have been uncovered in reports from service points, the 20 Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved. V - Links with National Reporting System When applicable, the data are reported through a single channel of the 17 national reporting system. When available, the relevant national forms/tools are used for data-collection 21 and reporting. The system records information about where the service is delivered (i.e. 22 region, district, ward, etc.) 23 ….if yes, place names are recorded using standarized naming conventions. District Site 3 Page 84
  • 85.
    Part 3: Recommendationsfor the District Site Based on the findings of the systems’ review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measur an estimate of the length of time the improvement measure could take. See systems assessment functions by function area (table below) for review of system). Action points should be disc with the Program. Identified Weaknesses Description of Action Point Responsible(s) 1 2 3 4 Part 4: DASHBOARD: District Site Data Management Assessment - Data and Reporting Verifications - District Site District Site 1200% I - M&E Structure, Functions and Capabilities 1000% 10 9 8 7 6 800% II- Indicator 5 V - Links with Definitions and 4 National Reporting 3 Reporting System Guidelines 2 1 600% 0 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification factor % Available % On Time % Complete District Site 3 Page 85
  • 86.
    Data Verification andSystem Assessment Sheet - District Site District Site/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - REVIEWER COMMENTS completely Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Deta responses will help guide strengthening measures. ) No - not at all N/A Part 1: Data Verifications A - Recounting reported Results: Recount results from the periodic reports sent from service sites to the District and compare to the value reported by the District. Explain discrepancies (if any). Re-aggregate the numbers from the reports received from all Service Delivery 1 Points. What is the re-aggregated number? [A] What aggregated result was contained in the summary report prepared by the 2 District (and submitted to the next reporting level)? [B] 3 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 4 errors, arithmetic errors, missing source documents, other)? B - Reporting Performance: Review availability, completeness, and timeliness of reports from all Service Delivery Sites. How many reports should there have been from all Sites? How many are there? Were they received on time? Are they complete? 5 How many reports should there have been from all service sites? [A] 6 How many reports are there? [B] 7 Calculate % Available Reports [B/A] - Check the dates on the reports received. How many reports were received 8 on time? (i.e., received by the due date). [C] 9 Calculate % On time Reports [C/A] - How many reports were complete? (i.e., complete means that the report 10 contained all the required indicator data*). [D] 11 Calculate % Complete Reports [D/A] - Part 2. 
Systems Assessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing the quality of data (i.e., 1 accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points). There are designated staff responsible for reviewing aggregated numbers 2 prior to submission to the next level (e.g., to the central M&E Unit). All relevant staff have received training on the data management processes 3 and tools. District Site 4 Page 86
  • 87.
    II- Indicator Definitionsand Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III- Data-collection and Reporting Forms / Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels ….The standard forms/tools are consistently used by the Service Delivery 10 Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). IV- Data Management Processes Feedback is systematically provided to all service points on the quality of their 12 reporting (i.e., accuracy, completeness and timeliness). If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. If yes, the latest date of back-up is appropriate given the frequency of update 15 of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. 
There is a written procedure to address late, incomplete, inaccurate and 19 missing reports; including following-up with service points on data quality issues. If data discrepancies have been uncovered in reports from service points, the 20 Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved. V - Links with National Reporting System When applicable, the data are reported through a single channel of the 17 national reporting system. When available, the relevant national forms/tools are used for data-collection 21 and reporting. The system records information about where the service is delivered (i.e. 22 region, district, ward, etc.) 23 ….if yes, place names are recorded using standarized naming conventions. District Site 4 Page 87
  • 88.
    Part 3: Recommendationsfor the District Site Based on the findings of the systems’ review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measur an estimate of the length of time the improvement measure could take. See systems assessment functions by function area (table below) for review of system). Action points should be disc with the Program. Identified Weaknesses Description of Action Point Responsible(s) 1 2 3 4 Part 4: DASHBOARD: District Site Data Management Assessment - Data and Reporting Verifications - District Site District Site 1200% I - M&E Structure, Functions and Capabilities 1000% 3 2 800% II- Indicator V - Links with Definitions and National Reporting 1 Reporting System Guidelines 600% 0 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification factor % Available % On Time % Complete District Site 4 Page 88
  • 89.
    Data Verification andSystem Assessment Sheet - District Site District Site/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - REVIEWER COMMENTS completely Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Deta responses will help guide strengthening measures. ) No - not at all N/A Part 1: Data Verifications A - Recounting reported Results: Recount results from the periodic reports sent from service sites to the District and compare to the value reported by the District. Explain discrepancies (if any). Re-aggregate the numbers from the reports received from all Service Delivery 1 Points. What is the re-aggregated number? [A] What aggregated result was contained in the summary report prepared by the 2 District (and submitted to the next reporting level)? [B] 3 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 4 errors, arithmetic errors, missing source documents, other)? B - Reporting Performance: Review availability, completeness, and timeliness of reports from all Service Delivery Sites. How many reports should there have been from all Sites? How many are there? Were they received on time? Are they complete? 5 How many reports should there have been from all service sites? [A] 6 How many reports are there? [B] 7 Calculate % Available Reports [B/A] - Check the dates on the reports received. How many reports were received 8 on time? (i.e., received by the due date). [C] 9 Calculate % On time Reports [C/A] - How many reports were complete? (i.e., complete means that the report 10 contained all the required indicator data*). [D] 11 Calculate % Complete Reports [D/A] - Part 2. 
Systems Assessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing the quality of data (i.e., 1 accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points). There are designated staff responsible for reviewing aggregated numbers 2 prior to submission to the next level (e.g., to the central M&E Unit). All relevant staff have received training on the data management processes 3 and tools. District Site 5 Page 89
  • 90.
II - Indicator Definitions and Reporting Guidelines
The M&E Unit has provided written guidelines to each sub-reporting level on …
4. … what they are supposed to report on.
5. … how (e.g., in what specific format) reports are to be submitted.
6. … to whom the reports should be submitted.
7. … when the reports are due.

III - Data-collection and Reporting Forms / Tools
8. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
9. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels …
10. … The standard forms/tools are consistently used by the Service Delivery Site.
11. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).

IV - Data Management Processes
12. Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
13. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
14. If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
15. If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
16. Relevant personal data are maintained according to national or international confidentiality guidelines.
17. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, or a person registered as receiving the same service in two different locations).
18. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
19. There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with service points on data quality issues.
20. If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies were resolved.

V - Links with National Reporting System
21. When applicable, the data are reported through a single channel of the national reporting system.
22. When available, the relevant national forms/tools are used for data collection and reporting.
23. The system records information about where the service is delivered (i.e., region, district, ward, etc.).
24. … If yes, place names are recorded using standardized naming conventions.

District Site 5 Page 90
Part 3: Recommendations for the District Site
Based on the findings of the systems review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measures, with an estimate of the length of time each improvement measure could take (see the systems assessment by function area, in the table below, for the review of the system). Action points should be discussed with the Program.

Identified Weaknesses | Description of Action Point | Responsible(s)
1. | |
2. | |
3. | |
4. | |

Part 4: DASHBOARD: District Site
[Radar chart: Data Management Assessment - District Site. One axis per function area: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]
[Bar chart: Data and Reporting Verifications - District Site. Bars: Verification factor, % Available, % On Time, % Complete.]

District Site 5 Page 91
Data Verification and System Assessment Sheet - District Site

District Site/Organization: -
Region and District: -
Indicator Reviewed: -
Date of Review: -
Reporting Period Verified: -

Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.

Part 1: Data Verifications

A - Recounting Reported Results: Recount results from the periodic reports sent from service sites to the District and compare to the value reported by the District. Explain discrepancies (if any).
1. Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A]
2. What aggregated result was contained in the summary report prepared by the District (and submitted to the next reporting level)? [B]
3. Calculate the ratio of recounted to reported numbers. [A/B]
4. What are the reasons for any discrepancy observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance: Review the availability, completeness, and timeliness of reports from all Service Delivery Sites.
5. How many reports should there have been from all service sites? [A]
6. How many reports are there? [B]
7. Calculate % Available Reports. [B/A]
8. Check the dates on the reports received. How many reports were received on time (i.e., by the due date)? [C]
9. Calculate % On-time Reports. [C/A]
10. How many reports were complete (i.e., contained all the required indicator data*)? [D]
11. Calculate % Complete Reports. [D/A]

Part 2: Systems Assessment

I - M&E Structure, Functions and Capabilities
1. There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
2. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
3. All relevant staff have received training on the data management processes and tools.

District Site 6 Page 92
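The Part 1 calculations above are simple ratios against the expected number of reports. As a minimal sketch only (the RDQA workbook computes these in Excel; the function and argument names below are illustrative, not part of the tool):

```python
# Sketch of the RDQA Part 1 data-verification arithmetic (illustrative only;
# the actual tool performs these calculations in the Excel workbook).

def verification_factor(recounted: float, reported: float) -> float:
    """Ratio of recounted to reported results [A/B]; 1.0 means exact agreement,
    below 1.0 suggests over-reporting, above 1.0 suggests under-reporting."""
    return recounted / reported

def reporting_performance(expected: int, received: int, on_time: int, complete: int):
    """Return (% available, % on time, % complete) as fractions of expected reports
    ([B/A], [C/A], [D/A] in the sheet's notation)."""
    return (received / expected, on_time / expected, complete / expected)

# Example: 12 monthly reports expected from the Service Delivery Points.
available, timely, complete = reporting_performance(
    expected=12, received=11, on_time=9, complete=10
)
print(f"Verification factor: {verification_factor(485, 500):.2f}")
print(f"Available {available:.0%}, on time {timely:.0%}, complete {complete:.0%}")
```

A verification factor of 0.97 here would prompt item 4 on the sheet: documenting why the recount fell short of the reported figure.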
II - Indicator Definitions and Reporting Guidelines
The M&E Unit has provided written guidelines to each sub-reporting level on …
4. … what they are supposed to report on.
5. … how (e.g., in what specific format) reports are to be submitted.
6. … to whom the reports should be submitted.
7. … when the reports are due.

III - Data-collection and Reporting Forms / Tools
8. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
9. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels …
10. … The standard forms/tools are consistently used by the Service Delivery Site.
11. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).

IV - Data Management Processes
12. Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
13. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
14. If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
15. If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
16. Relevant personal data are maintained according to national or international confidentiality guidelines.
17. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, or a person registered as receiving the same service in two different locations).
18. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
19. There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with service points on data quality issues.
20. If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies were resolved.

V - Links with National Reporting System
21. When applicable, the data are reported through a single channel of the national reporting system.
22. When available, the relevant national forms/tools are used for data collection and reporting.
23. The system records information about where the service is delivered (i.e., region, district, ward, etc.).
24. … If yes, place names are recorded using standardized naming conventions.

District Site 6 Page 93
Part 3: Recommendations for the District Site
Based on the findings of the systems review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measures, with an estimate of the length of time each improvement measure could take (see the systems assessment by function area, in the table below, for the review of the system). Action points should be discussed with the Program.

Identified Weaknesses | Description of Action Point | Responsible(s)
1. | |
2. | |
3. | |
4. | |

Part 4: DASHBOARD: District Site
[Radar chart: Data Management Assessment - District Site. One axis per function area: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]
[Bar chart: Data and Reporting Verifications - District Site. Bars: Verification factor, % Available, % On Time, % Complete.]

District Site 6 Page 94
Data Verification and System Assessment Sheet - District Site

District Site/Organization: -
Region and District: -
Indicator Reviewed: -
Date of Review: -
Reporting Period Verified: -

Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.

Part 1: Data Verifications

A - Recounting Reported Results: Recount results from the periodic reports sent from service sites to the District and compare to the value reported by the District. Explain discrepancies (if any).
1. Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A]
2. What aggregated result was contained in the summary report prepared by the District (and submitted to the next reporting level)? [B]
3. Calculate the ratio of recounted to reported numbers. [A/B]
4. What are the reasons for any discrepancy observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance: Review the availability, completeness, and timeliness of reports from all Service Delivery Sites.
5. How many reports should there have been from all service sites? [A]
6. How many reports are there? [B]
7. Calculate % Available Reports. [B/A]
8. Check the dates on the reports received. How many reports were received on time (i.e., by the due date)? [C]
9. Calculate % On-time Reports. [C/A]
10. How many reports were complete (i.e., contained all the required indicator data*)? [D]
11. Calculate % Complete Reports. [D/A]

Part 2: Systems Assessment

I - M&E Structure, Functions and Capabilities
1. There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
2. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
3. All relevant staff have received training on the data management processes and tools.

District Site 7 Page 95
II - Indicator Definitions and Reporting Guidelines
The M&E Unit has provided written guidelines to each sub-reporting level on …
4. … what they are supposed to report on.
5. … how (e.g., in what specific format) reports are to be submitted.
6. … to whom the reports should be submitted.
7. … when the reports are due.

III - Data-collection and Reporting Forms / Tools
8. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
9. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels …
10. … The standard forms/tools are consistently used by the Service Delivery Site.
11. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).

IV - Data Management Processes
12. Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
13. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
14. If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
15. If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
16. Relevant personal data are maintained according to national or international confidentiality guidelines.
17. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, or a person registered as receiving the same service in two different locations).
18. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
19. There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with service points on data quality issues.
20. If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies were resolved.

V - Links with National Reporting System
21. When applicable, the data are reported through a single channel of the national reporting system.
22. When available, the relevant national forms/tools are used for data collection and reporting.
23. The system records information about where the service is delivered (i.e., region, district, ward, etc.).
24. … If yes, place names are recorded using standardized naming conventions.

District Site 7 Page 96
Part 3: Recommendations for the District Site
Based on the findings of the systems review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measures, with an estimate of the length of time each improvement measure could take (see the systems assessment by function area, in the table below, for the review of the system). Action points should be discussed with the Program.

Identified Weaknesses | Description of Action Point | Responsible(s)
1. | |
2. | |
3. | |
4. | |

Part 4: DASHBOARD: District Site
[Radar chart: Data Management Assessment - District Site. One axis per function area: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]
[Bar chart: Data and Reporting Verifications - District Site. Bars: Verification factor, % Available, % On Time, % Complete.]

District Site 7 Page 97
Data Verification and System Assessment Sheet - District Site

District Site/Organization: -
Region and District: -
Indicator Reviewed: -
Date of Review: -
Reporting Period Verified: -

Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.

Part 1: Data Verifications

A - Recounting Reported Results: Recount results from the periodic reports sent from service sites to the District and compare to the value reported by the District. Explain discrepancies (if any).
1. Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A]
2. What aggregated result was contained in the summary report prepared by the District (and submitted to the next reporting level)? [B]
3. Calculate the ratio of recounted to reported numbers. [A/B]
4. What are the reasons for any discrepancy observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance: Review the availability, completeness, and timeliness of reports from all Service Delivery Sites.
5. How many reports should there have been from all service sites? [A]
6. How many reports are there? [B]
7. Calculate % Available Reports. [B/A]
8. Check the dates on the reports received. How many reports were received on time (i.e., by the due date)? [C]
9. Calculate % On-time Reports. [C/A]
10. How many reports were complete (i.e., contained all the required indicator data*)? [D]
11. Calculate % Complete Reports. [D/A]

Part 2: Systems Assessment

I - M&E Structure, Functions and Capabilities
1. There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
2. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
3. All relevant staff have received training on the data management processes and tools.

District Site 8 Page 98
II - Indicator Definitions and Reporting Guidelines
The M&E Unit has provided written guidelines to each sub-reporting level on …
4. … what they are supposed to report on.
5. … how (e.g., in what specific format) reports are to be submitted.
6. … to whom the reports should be submitted.
7. … when the reports are due.

III - Data-collection and Reporting Forms / Tools
8. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
9. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels …
10. … The standard forms/tools are consistently used by the Service Delivery Site.
11. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).

IV - Data Management Processes
12. Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
13. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
14. If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
15. If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
16. Relevant personal data are maintained according to national or international confidentiality guidelines.
17. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, or a person registered as receiving the same service in two different locations).
18. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
19. There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with service points on data quality issues.
20. If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies were resolved.

V - Links with National Reporting System
21. When applicable, the data are reported through a single channel of the national reporting system.
22. When available, the relevant national forms/tools are used for data collection and reporting.
23. The system records information about where the service is delivered (i.e., region, district, ward, etc.).
24. … If yes, place names are recorded using standardized naming conventions.

District Site 8 Page 99
Part 3: Recommendations for the District Site
Based on the findings of the systems review and data verification at the District site, please describe any compliance requirements or recommended strengthening measures, with an estimate of the length of time each improvement measure could take (see the systems assessment by function area, in the table below, for the review of the system). Action points should be discussed with the Program.

Identified Weaknesses | Description of Action Point | Responsible(s)
1. | |
2. | |
3. | |
4. | |

Part 4: DASHBOARD: District Site
[Radar chart: Data Management Assessment - District Site. One axis per function area: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]
[Bar chart: Data and Reporting Verifications - District Site. Bars: Verification factor, % Available, % On Time, % Complete.]

District Site 8 Page 100
District Site Summary Statistics

[Radar chart: Data Management Assessment - District Level Summary. One axis per function area: M&E Structure, Functions and Capabilities; Indicator Definitions and Reporting Guidelines; Data-collection and Reporting Forms / Tools; Data Management Processes; Links with National Reporting System.]
[Bar chart: Data and Reporting Verifications - District Level Summary. Bars: Verification Factor, % Available, % On Time, % Complete.]

District Summary Page 101
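The summary sheet rolls the per-district results up into a single view. As a rough sketch of that roll-up, assuming a simple average across the reviewed district sites (illustrative only; the workbook computes the summary in Excel, and the field names below are hypothetical):

```python
# Sketch of the district-summary roll-up (illustrative; the RDQA workbook
# does this aggregation in Excel). Each district site contributes one
# verification factor and its three reporting-performance percentages.

def summarize(district_stats: list) -> dict:
    """Simple average of each metric across the reviewed district sites."""
    keys = ["verification_factor", "pct_available", "pct_on_time", "pct_complete"]
    n = len(district_stats)
    return {k: sum(d[k] for d in district_stats) / n for k in keys}

# Hypothetical results from two reviewed district sites.
summary = summarize([
    {"verification_factor": 0.97, "pct_available": 0.92, "pct_on_time": 0.75, "pct_complete": 0.83},
    {"verification_factor": 1.05, "pct_available": 1.00, "pct_on_time": 0.90, "pct_complete": 0.95},
])
print(f"Summary verification factor: {summary['verification_factor']:.2f}")
print(f"On time (summary): {summary['pct_on_time']:.0%}")
```

A simple average weights every district equally; a roll-up weighted by report volume would be an equally reasonable design choice.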
Data Verification and System Assessment Sheet - Regional Site

Regional Site/Organization: -
Region: -
Indicator Reviewed: -
Date of Review: -
Reporting Period Verified: -

Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.

Part 1: Data Verifications

A - Recounting Reported Results: Recount results from the periodic reports sent from the Districts to the Region and compare to the value reported by the Region. Explain discrepancies (if any).
1. Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A]
2. What aggregated result was contained in the summary report prepared by the Intermediate Aggregation Site (and submitted to the next reporting level)? [B]
3. Calculate the ratio of recounted to reported numbers. [A/B]
4. What are the reasons for any discrepancy observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance: Review the availability, completeness, and timeliness of reports from all Districts within the Region.
5. How many reports should there have been from all Districts? [A]
6. How many reports are there? [B]
7. Calculate % Available Reports. [B/A]
8. Check the dates on the reports received. How many reports were received on time (i.e., by the due date)? [C]
9. Calculate % On-time Reports. [C/A]
10. How many reports were complete (i.e., contained all the required indicator data*)? [D]
11. Calculate % Complete Reports. [D/A]

Regional Site 1 Page 102
Part 2: Systems Assessment

I - M&E Structure, Functions and Capabilities
1. There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
2. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
3. All relevant staff have received training on the data management processes and tools.

II - Indicator Definitions and Reporting Guidelines
The M&E Unit has provided written guidelines to each sub-reporting level on …
4. … what they are supposed to report on.
5. … how (e.g., in what specific format) reports are to be submitted.
6. … to whom the reports should be submitted.
7. … when the reports are due.

III - Data-collection and Reporting Forms / Tools
8. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
9. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels …
10. … The standard forms/tools are consistently used by the Service Delivery Site.
11. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).

IV - Data Management Processes
12. Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
13. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
14. If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
15. If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
16. Relevant personal data are maintained according to national or international confidentiality guidelines.
17. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, or a person registered as receiving the same service in two different locations).
18. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
19. There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with service points on data quality issues.
20. If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies were resolved.

Regional Site 1 Page 103
V - Links with National Reporting System
21. When available, the relevant national forms/tools are used for data collection and reporting.
22. The system records information about where the service is delivered (i.e., region, district, ward, etc.).
23. … If yes, place names are recorded using standardized naming conventions.

Part 3: Recommendations for the Intermediate Aggregation Level
Based on the findings of the systems review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measures, with an estimate of the length of time each improvement measure could take (see the systems assessment by function area, in the table below, for the review of the system). Action points should be discussed with the Program.

Identified Weaknesses | Description of Action Point | Responsible(s)
1. | |
2. | |
3. | |
4. | |

Part 4: DASHBOARD: Intermediate Aggregation Level
[Radar chart: Data Management Assessment - Regional Site. One axis per function area: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System.]
[Bar chart: Data and Reporting Verifications - Regional Site. Bars: Verification factor, % Available, % On Time, % Complete.]

Regional Site 1 Page 104
Data Verification and System Assessment Sheet - Regional Site

Regional Site/Organization: -
Region: -
Indicator Reviewed: -
Date of Review: -
Reporting Period Verified: -

Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.

Part 1: Data Verifications

A - Recounting Reported Results: Recount results from the periodic reports sent from the Districts to the Region and compare to the value reported by the Region. Explain discrepancies (if any).
1. Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A]
2. What aggregated result was contained in the summary report prepared by the Intermediate Aggregation Site (and submitted to the next reporting level)? [B]
3. Calculate the ratio of recounted to reported numbers. [A/B]
4. What are the reasons for any discrepancy observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance: Review the availability, completeness, and timeliness of reports from all Districts within the Region.
5. How many reports should there have been from all Districts? [A]
6. How many reports are there? [B]
7. Calculate % Available Reports. [B/A]
8. Check the dates on the reports received. How many reports were received on time (i.e., by the due date)? [C]
9. Calculate % On-time Reports. [C/A]
10. How many reports were complete (i.e., contained all the required indicator data*)? [D]
11. Calculate % Complete Reports. [D/A]

Regional Site 2 Page 105
Part 2: Systems Assessment

I - M&E Structure, Functions and Capabilities
1. There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
2. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
3. All relevant staff have received training on the data management processes and tools.

II - Indicator Definitions and Reporting Guidelines
The M&E Unit has provided written guidelines to each sub-reporting level on …
4. … what they are supposed to report on.
5. … how (e.g., in what specific format) reports are to be submitted.
6. … to whom the reports should be submitted.
7. … when the reports are due.

III - Data-collection and Reporting Forms / Tools
8. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
9. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels …
10. … The standard forms/tools are consistently used by the Service Delivery Site.
11. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).

IV - Data Management Processes
12. Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
13. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
14. If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
15. If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
16. Relevant personal data are maintained according to national or international confidentiality guidelines.
17. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, or a person registered as receiving the same service in two different locations).
18. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
19. There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with service points on data quality issues.
20. If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies were resolved.

Regional Site 2 Page 106
V - Links with National Reporting System
21. When available, the relevant national forms/tools are used for data-collection and reporting.
22. The system records information about where the service is delivered (i.e., region, district, ward, etc.).
23. … If yes, place names are recorded using standardized naming conventions.

Part 3: Recommendations for the Intermediate Aggregation Level
Based on the findings of the systems review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measures, with an estimate of the length of time the improvement measure could take. See the systems assessment by functional area (table below) for the review of the system. Action points should be discussed with the Program.

    Identified Weaknesses    Description of Action Point    Responsible(s)
1
2
3
4

Part 4: DASHBOARD: Intermediate Aggregation Level
[Radar chart: Data Management Assessment across the five functional areas (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System), scored 0-3.]
[Bar chart: Data and Reporting Verifications - Verification factor, % Available, % On Time, % Complete.]

Regional Site 2 Page 107
Data Verification and System Assessment Sheet - Regional Site

Regional Site/Organization: -    Region: -
Indicator Reviewed: -    Date of Review: -    Reporting Period Verified: -

Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.

Part 1: Data Verifications

A - Recounting Reported Results: Recount results from the periodic reports sent from the Districts to the Region and compare to the value reported by the Region. Explain discrepancies (if any).
1. Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A]
2. What aggregated result was contained in the summary report prepared by the Intermediate Aggregation Site (and submitted to the next reporting level)? [B]
3. Calculate the ratio of recounted to reported numbers. [A/B]
4. What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance: Review availability, completeness and timeliness of reports from all Districts within the Region. How many reports should there have been? How many are there? Were they received on time? Are they complete?
5. How many reports should there have been from all Districts? [A]
6. How many reports are there? [B]
7. Calculate % Available Reports. [B/A]
8. Check the dates on the reports received. How many reports were received on time (i.e., by the due date)? [C]
9. Calculate % On Time Reports. [C/A]
10. How many reports were complete (i.e., the report contained all the required indicator data*)? [D]
11. Calculate % Complete Reports. [D/A]

Regional Site 3 Page 108
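The Part 1 checks above reduce to a handful of ratios. A minimal sketch of that arithmetic follows; the function names and sample figures are illustrative assumptions, not part of the RDQA tool itself:

```python
# Illustrative helpers for the RDQA Part 1 "Data Verifications" arithmetic.

def verification_factor(recounted, reported):
    """Ratio of recounted to reported results [A/B]."""
    return recounted / reported

def reporting_performance(expected, received, on_time, complete):
    """% Available [B/A], % On Time [C/A] and % Complete [D/A] reports."""
    return {
        "% Available": 100 * received / expected,
        "% On Time": 100 * on_time / expected,
        "% Complete": 100 * complete / expected,
    }

# Example: the Region reported 500 but the district reports re-aggregate
# to 480; 12 district reports were expected, 11 received, 9 on time,
# 10 complete (all numbers invented for illustration).
print(verification_factor(480, 500))          # 0.96
print(reporting_performance(12, 11, 9, 10))
```

A verification factor below 1 indicates over-reporting relative to the recount; above 1 indicates under-reporting.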
Regional Site Summary Statistics
[Radar chart: Data Management Assessment - Regional Level Summary across the five functional areas (M&E Structure, Functions and Capabilities; Indicator Definitions and Reporting Guidelines; Data-collection and Reporting Forms / Tools; Data Management Processes; Links with National Reporting System), scored 0-3.]
[Bar chart: Data and Reporting Verifications - Regional Level Summary, by Verification Factor: % Available, % On Time, % Complete.]

Regional Summary Page 114
Data Verification and System Assessment Sheet - National Level M&E Unit

National Level M&E Unit/Organization: -
Indicator Reviewed: -    Date of Review: -    Reporting Period Verified: -

Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS: Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.

Part 1: Data Verifications

A - Recounting Reported Results: Recount results from the periodic reports sent from the intermediate aggregation sites to the National Level and compare to the value published by the National Program (or reported by the National Program to the Donor, if applicable). Explain discrepancies (if any).
1. Re-aggregate the numbers from the reports received from all reporting entities. What is the re-aggregated number? [A]
2. What aggregated result was contained in the summary report prepared by the M&E Unit? [B]
3. Calculate the ratio of recounted to reported numbers. [A/B]
4. What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?

B - Reporting Performance: Review availability, completeness and timeliness of reports from all Intermediate Aggregation Sites. How many reports should there have been? How many are there? Were they received on time? Are they complete?
5. How many reports should there have been from all reporting entities (e.g., regions, districts, service points)? [A]
6. How many reports are there? [B]
7. Calculate % Available Reports. [B/A]
8. Check the dates on the reports received. How many reports were received on time (i.e., by the due date)? [C]
9. Calculate % On Time Reports. [C/A]
10. How many reports were complete (i.e., the report contained all the required indicator data*)? [D]
11. Calculate % Complete Reports. [D/A]

Part 2: Systems Assessment

I - M&E Structure, Functions and Capabilities
1. There is a documented organizational structure/chart that clearly identifies positions that have data management responsibilities at the M&E Unit (specify which Unit: e.g., MoH, NAP, GF, World Bank).
2. All staff positions dedicated to M&E and data management systems are filled.
3. A senior staff member (e.g., the Program Manager) is responsible for reviewing the aggregated numbers prior to the submission/release of reports from the M&E Unit.
4. There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness, timeliness and confidentiality) received from sub-reporting levels (e.g., regions, districts, service points).
5. There is a training plan which includes staff involved in data collection and reporting at all levels in the reporting process.
6. All relevant staff have received training on the data management processes and tools.

National Level - M&E Unit Page 115
II - Indicator Definitions and Reporting Guidelines
7. The M&E Unit has documented and shared the definition of the indicator(s) with all relevant levels of the reporting system (e.g., regions, districts, service points).
8. There is a description of the services that are related to each indicator measured by the Program/project.
9. There is a written policy that states for how long source documents and reporting forms need to be retained.
10. The M&E Unit has provided written guidelines to all reporting entities (e.g., regions, districts, service points) on reporting requirements and deadlines.
The M&E Unit has provided written guidelines to each sub-reporting level on …
11. … what they are supposed to report on.
12. … how (e.g., in what specific format) reports are to be submitted.
13. … to whom the reports should be submitted.
14. … when the reports are due.

III - Data-collection and Reporting Forms / Tools
15. If multiple organizations are implementing activities under the Program/project, they all use the same reporting forms and report according to the same reporting timelines.
16. The M&E Unit has identified a standard source document (e.g., medical record, client intake form, register, etc.) to be used by all service delivery points to record service delivery.
17. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels.
18. … The standard forms/tools are consistently used by the Service Delivery Site.
19. Clear instructions have been provided by the M&E Unit on how to complete the data-collection and reporting forms/tools.
20. The data collected by the M&E system has sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).
21. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).

IV - Data Management Processes
22. The M&E Unit has clearly documented data aggregation, analysis and/or manipulation steps performed at each level of the reporting system.
23. Feedback is systematically provided to all sub-reporting levels on the quality of their reporting (i.e., accuracy, completeness and timeliness).
24. (If applicable) There are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
25. (If applicable) There is a written back-up procedure for when data entry or data processing is computerized.
26. … If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
27. Relevant personal data are maintained according to national or international confidentiality guidelines.
28. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).
29. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
30. There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with sub-reporting levels on data quality issues.
31. If data discrepancies have been uncovered in reports from sub-reporting levels (e.g., districts or regions), the M&E Unit has documented how these inconsistencies have been resolved.
32. The M&E Unit can demonstrate that regular supervisory site visits have taken place and that data quality has been reviewed.

National Level - M&E Unit Page 116
V - Links with National Reporting System
33. When applicable, the data are reported through a single channel of the national reporting system.
34. When available, the relevant national forms/tools are used for data-collection and reporting.
35. Reporting deadlines are harmonized with the relevant timelines of the National Program (e.g., cut-off dates for monthly reporting).
36. The service sites are identified using ID numbers that follow a national system.
37. The system records information about where the service is delivered (i.e., region, district, ward, etc.).
38. … If yes, place names are recorded using standardized naming conventions.

Part 3: Follow-up Recommendations and Action Plan - M&E Unit
Summarize key issues that the Program should follow up at various levels of the system (e.g., issues found at site level and/or at intermediate aggregation site level).

    Identified Weaknesses    Description of Action Point    Responsible(s)    Time Line
1
2
3
4

Part 4: DASHBOARD: National Level - M&E Unit
[Radar chart: Data Management Assessment across the five functional areas (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System), scored 0-3.]
[Bar chart: Data and Reporting Verifications - M&E Unit, by Verification Factor: % Available, % On Time, % Complete.]

National Level - M&E Unit Page 117
SUMMARY TABLE (per site): Assessment of Data Management and Reporting Systems

Functional areas: I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms / Tools; IV - Data Management Processes; V - Links with National Reporting System.

Site                                        I    II   III  IV   V    Average
M&E Unit                                    N/A  N/A  N/A  N/A  N/A  N/A
Regional Level 1                            N/A  N/A  N/A  N/A  N/A  N/A
Intermediate Aggregation Level Sites 1      N/A  N/A  N/A  N/A  N/A  N/A
Service Delivery Points/Organizations 1     N/A  N/A  N/A  N/A  N/A  N/A
Average (per functional area)               N/A  N/A  N/A  N/A  N/A  N/A

System Assessment Summary Page 118
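Once scores are entered, the per-site and per-functional-area averages in this table are mechanical. A hedged sketch, assuming the 0-3 scale used by the dashboards and treating unscored cells (N/A) as missing; the data and helper name are invented for illustration:

```python
# Illustrative computation of the RDQA summary-table averages.
FUNCTIONAL_AREAS = ["I", "II", "III", "IV", "V"]

def summarize(scores):
    """scores: {site: {area: score or None}} on the 0-3 assessment scale.
    Returns (average per site, average per functional area), skipping None."""
    site_avgs = {}
    for site, areas in scores.items():
        vals = [v for v in areas.values() if v is not None]
        site_avgs[site] = sum(vals) / len(vals) if vals else None
    area_avgs = {}
    for area in FUNCTIONAL_AREAS:
        vals = [s[area] for s in scores.values() if s.get(area) is not None]
        area_avgs[area] = sum(vals) / len(vals) if vals else None
    return site_avgs, area_avgs

# Invented example scores for two sites.
scores = {
    "M&E Unit":        {"I": 3, "II": 2, "III": 3, "IV": 2, "V": 3},
    "Regional Site 1": {"I": 2, "II": 2, "III": 1, "IV": 2, "V": 3},
}
site_avgs, area_avgs = summarize(scores)
print(site_avgs["M&E Unit"])  # 2.6
print(area_avgs["I"])         # 2.5
```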
Global Dashboard - Summary Statistics, All Levels
[Radar chart: Data Management Assessment - Global Aggregate Score across the five functional areas (M&E Structure, Functions and Capabilities; Indicator Definitions and Reporting Guidelines; Data-collection and Reporting Forms / Tools; Data Management Processes; Links with National Reporting System), scored 0-3.]
[Bar chart: Data and Reporting Verifications - Global Aggregate Score, by Verification Factor: % Available, % On Time, % Complete.]

Global Dashboard Page 119
RDQA Final Action Plan

Country:    Program/project:    Date of RDQA:    Date of Proposed Follow-up:

    Description of Weakness    System Strengthening Measures    Responsible(s)    Timeline    Comments
(Add rows as needed.)

Summary of Site Specific Action Plans
(For each site, rows 1-4: Identified Weaknesses, System Strengthening Measures, Responsible(s), Time line, Comments.)

National Level - M&E Unit
1    -    -    -    -
2    -    -    -    -
3    -    -    -    -
4    -    -    -    -
Regional Site 1
1    -    -    -    -
2    -    -    -    -
3    -    -    -    -
4    -    -    -    -
District Site 1
1    -    -    -    -
2    -    -    -    -
3    -    -    -    -
4    -    -    -    -
Service Point 1
1    -    -    -    -
2    -    -    -    -
3    -    -    -    -
4    -    -    -    -

RDQA Final Action Plan Page 120
Systems Assessment Components Contributing to Data Quality Dimensions

In the original matrix, each question is checked against the level(s) at which it applies (M&E Unit; Aggregation Levels; Service Points) and the dimension(s) of data quality it contributes to (Accuracy; Reliability; Timeliness; Completeness; Precision; Confidentiality; Integrity). [Check-mark placement omitted below.]

I - M&E Structure, Functions and Capabilities
- There is a documented organizational structure/chart that clearly identifies positions that have data management responsibilities at the M&E Unit (specify which Unit: e.g., MoH, NAP, GF, World Bank).
- All staff positions dedicated to M&E and data management systems are filled.
- A senior staff member (e.g., the Program Manager) is responsible for reviewing the aggregated numbers prior to the submission/release of reports from the M&E Unit.
- There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness, timeliness and confidentiality) received from sub-reporting levels (e.g., regions, districts, service points).
- There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
- The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.
- There is a training plan which includes staff involved in data collection and reporting at all levels in the reporting process.
- All relevant staff have received training on the data management processes and tools.

II - Indicator Definitions and Reporting Guidelines
- The M&E Unit has documented and shared the definition of the indicator(s) with all relevant levels of the reporting system (e.g., regions, districts, service points).
- There is a description of the services that are related to each indicator measured by the Program/project.
- The M&E Unit has provided written guidelines to all reporting entities (e.g., regions, districts, service points) on reporting requirements and deadlines.
- There is a written policy that states for how long source documents and reporting forms need to be retained.

III - Data-collection and Reporting Forms / Tools
- If multiple organizations are implementing activities under the Program/project, they all use the same reporting forms and report according to the same reporting timelines.

List of Survey Questions Page 121
Systems Assessment Components Contributing to Data Quality Dimensions (continued)

III - Data-collection and Reporting Forms / Tools (continued)
- The M&E Unit has identified a standard source document (e.g., medical record, client intake form, register, etc.) to be used by all service delivery points to record service delivery.
- The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels.
- … The standard forms/tools are consistently used by all levels.
- Clear instructions have been provided by the M&E Unit on how to complete the data-collection and reporting forms/tools.
- The data collected by the M&E system has sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).
- All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).

IV - Data Management Processes
- The M&E Unit has clearly documented data aggregation, analysis and/or manipulation steps performed at each level of the reporting system.
- Feedback is systematically provided to all sub-reporting levels on the quality of their reporting (i.e., accuracy, completeness and timeliness).
- [If applicable] There are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
- [If applicable] There is a written back-up procedure for when data entry or data processing is computerized.
- … If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
- Relevant personal data are maintained according to national or international confidentiality guidelines.
- The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).
- The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.

List of Survey Questions Page 122
Systems Assessment Components Contributing to Data Quality Dimensions (continued)

IV - Data Management Processes (continued)
- There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with sub-reporting levels on data quality issues.
- If data discrepancies have been uncovered in reports from sub-reporting levels (e.g., districts or regions), the M&E Unit has documented how these inconsistencies have been resolved.
- The M&E Unit can demonstrate that regular supervisory site visits have taken place and that data quality has been reviewed.

V - Links with National Reporting System
- When available, the relevant national forms/tools are used for data-collection and reporting.
- When applicable, the data are reported through a single channel of the national reporting system.
- Reporting deadlines are harmonized with the relevant timelines of the National Program (e.g., cut-off dates for monthly reporting).
- The service sites are identified using ID numbers that follow a national system.
- The system records information about where the service is delivered (i.e., region, district, ward, etc.).
- … If yes, place names are recorded using standardized naming conventions.

List of Survey Questions Page 123